shenzhi-wang/Llama3.1-8B-Chinese-Chat
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Jul 24, 2024 · License: llama3.1 · Architecture: Transformer · Warm: 0.3K

shenzhi-wang/Llama3.1-8B-Chinese-Chat is an 8 billion parameter instruction-tuned language model developed by Shenzhi Wang and Yaowei Zheng, built upon Meta-Llama-3.1-8B-Instruct. It is specifically fine-tuned for Chinese and English users, excelling in roleplay, function calling, and mathematical capabilities. The model supports a context length of 128K tokens and is optimized for diverse conversational applications.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model adjust the following sampler parameters:

- `temperature`
- `top_p`
- `top_k`
- `frequency_penalty`
- `presence_penalty`
- `repetition_penalty`
- `min_p`
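As a sketch of how these sampler parameters are typically passed to an OpenAI-compatible chat completion endpoint, the snippet below builds a request payload for this model. The numeric values are illustrative assumptions for demonstration only, not the actual top Featherless configurations, and the exact set of accepted parameters depends on the serving backend.

```python
import json

# Illustrative request payload for an OpenAI-compatible chat API.
# Parameter names mirror the sampler settings listed above; the
# values here are assumptions, not measured user configurations.
payload = {
    "model": "shenzhi-wang/Llama3.1-8B-Chinese-Chat",
    "messages": [{"role": "user", "content": "你好！请介绍一下你自己。"}],
    "temperature": 0.7,         # randomness of token sampling
    "top_p": 0.9,               # nucleus sampling probability mass cutoff
    "top_k": 40,                # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they already appeared
    "presence_penalty": 0.0,    # penalize tokens that appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below this fraction of the top token's probability
}

print(json.dumps(payload, indent=2))
```

Some of these knobs (`repetition_penalty`, `min_p`, `top_k`) are extensions beyond the core OpenAI API and may only be honored by backends that support them.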