zjunlp/knowlm-13b-base-v1.0
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer

zjunlp/knowlm-13b-base-v1.0 is a 13-billion-parameter language model built on the LLaMA-13B architecture. It supports a 4096-token context length and underwent a secondary full-scale pretraining phase on bilingual Chinese and English data. This additional training significantly improves the model's comprehension of Chinese, making it well suited to bilingual applications.
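As a minimal sketch of how such a checkpoint is typically loaded, the snippet below uses the Hugging Face `transformers` API. The loader function and its arguments are illustrative assumptions, not part of the model card; running it requires `transformers` installed and enough memory for a 13B checkpoint.

```python
MODEL_ID = "zjunlp/knowlm-13b-base-v1.0"
MAX_CONTEXT = 4096  # context length stated on the card


def load_knowlm(model_id: str = MODEL_ID):
    """Load tokenizer and model for the given checkpoint.

    Import is kept inside the function so the module stays cheap to
    import; `device_map="auto"` additionally requires `accelerate`.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # pick the checkpoint's native dtype
        device_map="auto",    # spread weights across available devices
    )
    return tokenizer, model
```

Once loaded, the pair can be used for standard causal generation, e.g. `model.generate(**tokenizer("知识图谱是", return_tensors="pt").to(model.device), max_new_tokens=64)`, keeping prompt plus output within the 4096-token context.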
