cczhong/llama2-chinese-7b-chat-merged
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: llama2 · Architecture: Transformer · Open Weights · Cold

cczhong/llama2-chinese-7b-chat-merged is a 7-billion-parameter Llama 2-based language model, created by merging FlagAlpha's Llama2-Chinese-7b-Chat-LoRA adapter weights into the base model. It is designed specifically for Chinese-language chat applications, building on the Llama 2 architecture, and supports a 4096-token context length, making it suitable for conversational AI in Chinese.
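Since this is a Llama 2 chat variant, prompts should follow the standard Llama 2 chat template (`[INST]`/`[/INST]` with an optional `<<SYS>>` system block). A minimal sketch of a prompt builder, assuming that template applies to this merged model; the function name and the example Chinese system prompt are illustrative:

```python
def build_llama2_chat_prompt(user_message: str,
                             system_prompt: str = "你是一个乐于助人的助手。") -> str:
    # Standard Llama 2 chat format: the system prompt sits inside <<SYS>> tags
    # within the first [INST] block; the model's reply follows [/INST].
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_llama2_chat_prompt("请介绍一下北京的历史。")
print(prompt)
```

The resulting string can then be passed to a text-generation call against `cczhong/llama2-chinese-7b-chat-merged` (e.g. via Hugging Face `transformers`), keeping the total prompt plus completion within the 4096-token context window.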
