lightbringergloba/Qwen2.5-1.5B
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 4, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

lightbringergloba/Qwen2.5-1.5B is a 1.54-billion-parameter causal language model from the Qwen2.5 series, developed by the Qwen Team. This base model supports a 32,768-token context length and uses a Transformer architecture with RoPE, SwiGLU, and RMSNorm. It offers significantly improved capabilities in coding, mathematics, instruction following, and structured-data understanding, making it well suited to further fine-tuning for specialized applications.
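As a sketch of how this checkpoint might be used, the snippet below loads it for plain text completion with the Hugging Face `transformers` library. Only the repo id (`lightbringergloba/Qwen2.5-1.5B`) comes from this page; the loading pattern and the `generate` helper are standard `transformers` usage, not an official recipe from the model authors.

```python
# Hedged sketch: loading the checkpoint in bfloat16 (its published precision)
# and completing a prompt. Requires the transformers and torch packages.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "lightbringergloba/Qwen2.5-1.5B"  # repo id taken from this page

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Download the tokenizer and model, then continue the given prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because this is a base model rather than an instruction-tuned one, prompts are treated as plain text to be continued; there is no chat template to apply.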
