Hyeji0101/qwen2_5_1_5b_demo
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

Hyeji0101/qwen2_5_1_5b_demo is a 1.5-billion-parameter language model with a 32,768-token context window. Published by Hyeji0101 as a demonstration of the Qwen2.5 architecture, it serves as a foundation model for general natural language processing tasks, balancing output quality against computational cost.
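As a sketch of how such a model is typically used, the following assumes the checkpoint is hosted on the Hugging Face Hub under the id shown on this page and loads it with the `transformers` library in BF16, matching the quantization listed above. The chat-message format is the standard one Qwen2.5-family instruct models expect; whether this particular demo checkpoint is instruction-tuned is an assumption.

```python
MODEL_ID = "Hyeji0101/qwen2_5_1_5b_demo"  # model id from this page
MAX_CONTEXT = 32768  # tokens, per the model card


def build_chat(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format Qwen2.5 models expect."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": prompt},
    ]


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model in BF16 and generate a completion for one prompt."""
    # Heavy dependencies are imported lazily so the helpers above stay
    # importable without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    # Render the chat messages with the model's own template, then tokenize.
    text = tokenizer.apply_chat_template(
        build_chat(prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Strip the prompt tokens and decode only the newly generated text.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain what a context window is in one sentence."))
```

Loading in BF16 keeps the 1.5B model's memory footprint around 3 GB of weights, which is the usual trade-off this size class targets.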
