david830729/qwen2_5_1_5b_demo
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

david830729/qwen2_5_1_5b_demo is a 1.5-billion-parameter language model, likely based on the Qwen2.5 architecture, intended for general language understanding and generation tasks. With a context length of 32,768 tokens, it can process moderately long inputs. Its primary purpose is to demonstrate the capabilities of the Qwen2.5 model family at a smaller scale.
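A minimal usage sketch, assuming the checkpoint loads with the Hugging Face `transformers` library (standard for Qwen2.5-style models). Only the model id and the 32k context length come from this card; the prompt, generation settings, and the `fits_in_context` / `generate_demo` helper names are illustrative assumptions.

```python
# Sketch: budgeting against the 32k context window and generating text
# with david830729/qwen2_5_1_5b_demo. Generation settings are assumptions.

MAX_CTX = 32_768  # context length stated on the card


def fits_in_context(prompt_tokens: int, new_tokens: int, max_ctx: int = MAX_CTX) -> bool:
    """True if the prompt plus the requested completion stays within the context window."""
    return prompt_tokens + new_tokens <= max_ctx


def generate_demo(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the checkpoint and generate a completion (requires network access)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "david830729/qwen2_5_1_5b_demo"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

    inputs = tokenizer(prompt, return_tensors="pt")
    # Refuse prompts that would overflow the 32,768-token window.
    assert fits_in_context(inputs["input_ids"].shape[1], max_new_tokens)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Calling `generate_demo("Explain what a context window is.")` would download the weights on first use; the context-budget check is pure arithmetic and runs offline.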
