tiansz/ChatYuan-7B-merge
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Architecture: Transformer

tiansz/ChatYuan-7B-merge is a 7-billion-parameter LLaMA-based large language model developed by tiansz for Chinese-English dialogue. Built on the LLaMA architecture, it specializes in bilingual conversational tasks and handles mixed-language interactions robustly. Because it is optimized for generating responses in a dialogue format, it is well suited to chatbot applications that require both Chinese and English language capabilities.
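A minimal usage sketch with the Hugging Face `transformers` library is shown below. The prompt template and generation parameters are illustrative assumptions, not values documented by the model author; check the model card for the exact dialogue format the model was trained on.

```python
def build_prompt(user_message: str) -> str:
    """Build a single-turn dialogue prompt.

    This "User:/Assistant:" template is an assumption for illustration;
    the model may expect a different format.
    """
    return f"User: {user_message}\nAssistant:"


def chat(user_message: str, model_name: str = "tiansz/ChatYuan-7B-merge") -> str:
    # Lazy import so the prompt helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer(build_prompt(user_message), return_tensors="pt")
    # max_new_tokens=128 is an arbitrary illustrative choice.
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # A bilingual example query ("Please introduce yourself in Chinese.")
    print(chat("请用中文介绍一下你自己。"))
```

Note that loading a 7B model requires substantial memory; in practice one would typically pass `torch_dtype` or a quantization config to `from_pretrained`.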
