opencsg/csg-wukong-1B
Task: Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Published: Apr 11, 2024 · License: apache-2.0 · Architecture: Transformer
csg-wukong-1B is a 1.1 billion-parameter small language model (SLM) developed by OpenCSG. Pretrained on 1 trillion tokens, it is designed for efficient language tasks. It ranked 8th among pretrained small language models of roughly 1.5B parameters on the Open LLM Leaderboard, indicating strong performance within its size class. This model suits applications that require a compact yet capable language model.
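A minimal usage sketch with the Hugging Face `transformers` library, assuming the weights are published under the repo id shown on this card; the prompt and generation settings are illustrative, and the model download is gated behind an environment variable so the sketch can be read or imported without fetching weights:

```python
# Sketch: text generation with csg-wukong-1B via Hugging Face transformers.
# Assumes the standard AutoModelForCausalLM / AutoTokenizer loading path;
# generation parameters here are illustrative, not from the card.
import os

MODEL_ID = "opencsg/csg-wukong-1B"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Imported lazily so this file can be inspected without heavy dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # the card lists BF16 weights
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    # Keep prompt plus output within the 2k context window listed above.
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__" and os.environ.get("RUN_WUKONG_DEMO"):
    print(generate("Once upon a time"))
```

The lazy imports and the `RUN_WUKONG_DEMO` guard are design choices for a sketch: they let the snippet sit in a larger codebase without pulling the ~1.1B-parameter checkpoint until a generation is actually requested.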