Metaspectral/Tai

Text Generation · Model Size: 69B · Quant: FP8 · Context Length: 32k · Concurrency Cost: 4 · Published: Oct 29, 2023 · License: llama2 · Architecture: Transformer

Tai is a 69 billion parameter large language model developed by Metaspectral, based on the LLaMA-2 architecture. It was trained as a general-purpose model with a 32,768-token context window and is intended to be particularly helpful for questions on STEM subjects.


Overview

Tai is a 69 billion parameter Large Language Model (LLM) developed by Metaspectral, built upon the LLaMA-2 architecture. It is designed as a general-purpose model, with a particular focus on providing assistance and accurate information for STEM (Science, Technology, Engineering, and Mathematics) related inquiries.

Key Capabilities

  • General-purpose language understanding: Capable of processing and generating human-like text across a wide range of topics.
  • STEM subject expertise: Optimized to be helpful in answering questions and providing information related to scientific and technical fields.
  • Large context window: Features a 32768-token context length, allowing for processing and understanding longer inputs and maintaining coherence over extended conversations or documents.

Prompt Format

Tai expects the following prompt format for best results:

SYSTEM:
USER:
ASSISTANT:

This structure clearly delineates the roles in a conversation and improves response quality.
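As an illustration, a prompt in this format could be assembled as follows. This is a minimal sketch; the function and variable names are hypothetical and not part of any Metaspectral tooling:

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a Tai-style prompt with SYSTEM/USER/ASSISTANT role markers.

    The trailing "ASSISTANT:" line is left empty so the model's completion
    fills in the assistant's reply. Illustrative only.
    """
    return (
        f"SYSTEM: {system}\n"
        f"USER: {user}\n"
        f"ASSISTANT:"
    )

prompt = build_prompt(
    "You are a helpful assistant for STEM questions.",
    "What is the boiling point of water at sea level?",
)
print(prompt)
```

The assembled string would then be sent to the model as a single completion request, with generation continuing after the final `ASSISTANT:` marker.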