Unichat-llama3-Chinese-8B-28K: Long-Context Chinese Llama 3 Fine-tune
UnicomLLM/Unichat-llama3-Chinese-8B-28K is the first Llama 3-based Chinese instruction-tuned model with an extended context window, developed by UnicomLLM's AI Innovation Center. Released on April 26, 2024, this model is built upon Meta's Llama 3-8B and has undergone full parameter fine-tuning (not LoRA/LongLoRA) with additional Chinese data to achieve high-quality Chinese question-answering capabilities.
Key Capabilities & Features
- Extended Context Length: Features a native context length of 28,000 tokens, significantly enhancing its ability to process and understand long documents.
- High-Quality Chinese QA: Specifically trained to excel at Chinese question answering, producing accurate, detailed responses even when the question targets a long input document.
- Full Parameter Fine-tuning: Updates all model weights during fine-tuning, rather than using adapter-based methods such as LoRA or LongLoRA, for deeper integration of Chinese language understanding.
- Curated Training Data: Trained on high-quality, long-text instruction data covering multiple domains, with data rigorously screened for optimal model training.
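When feeding long documents to the model, it helps to check that the input will fit inside the 28,000-token window before generation. The sketch below is a hypothetical pre-flight helper, not part of the model's tooling: the characters-per-token ratio is an illustrative assumption (Llama 3's tokenizer often maps roughly one Chinese character to one token), so for exact counts you should tokenize with the model's actual tokenizer.

```python
# Hypothetical helper: rough estimate of whether a Chinese document plus a
# prompt fits in Unichat-llama3-Chinese-8B-28K's 28,000-token context window.
# The chars_per_token ratio is an assumption for illustration only; use the
# real tokenizer for exact token counts.
CONTEXT_TOKENS = 28_000

def fits_in_context(document: str, prompt: str,
                    chars_per_token: float = 1.0,
                    reserve_for_answer: int = 1_000) -> bool:
    """Return True if the estimated token count, plus room reserved for
    the model's answer, stays within the context window."""
    est_tokens = int(len(document + prompt) / chars_per_token)
    return est_tokens + reserve_for_answer <= CONTEXT_TOKENS
```

For inputs that fail this check, the usual options are truncating the document or splitting it into chunks and answering per chunk.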
Use Cases
This model is particularly well-suited for applications requiring deep understanding and generation of responses from extensive Chinese texts. For example, it can accurately answer complex questions based on long articles, as demonstrated by its ability to extract detailed information from a 16,000-character excerpt of "Ming Dynasty Affairs" regarding the composition and duties of the Ming Dynasty's "Three Great Camps."
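A long-document QA request like the one above is typically framed as a single user turn containing both the document and the question. The sketch below builds such a prompt by hand in the standard Llama 3 chat format; it assumes this fine-tune keeps the base Llama 3 template (the instruction wording is an illustrative example, and in practice `tokenizer.apply_chat_template` from Hugging Face transformers is the safer route).

```python
# Hypothetical sketch: a long-document QA prompt in the standard Llama 3
# chat format. Assumes the fine-tune uses the base Llama 3 template; prefer
# tokenizer.apply_chat_template with the model's own tokenizer in practice.
def build_prompt(document: str, question: str) -> str:
    # Instruction text is illustrative: "Answer the question based on the
    # following document." followed by the document and the question.
    user_turn = f"请根据以下文档回答问题。\n\n文档:\n{document}\n\n问题:{question}"
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_turn}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```

The trailing assistant header leaves the model positioned to generate its answer as the next tokens.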
For more details, refer to the Unichat-llama3-Chinese GitHub repository.