AlienKevin/swe-smith-rs-base-qwen2.5-coder-32b-instruct-teacher-glm-4.6
Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 3, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

AlienKevin/swe-smith-rs-base-qwen2.5-coder-32b-instruct-teacher-glm-4.6 is an instruction-tuned, 32.5-billion-parameter causal language model based on the Qwen2.5-Coder series developed by Qwen. It is optimized for code generation, code reasoning, and code fixing, building on the Qwen2.5 architecture and a 5.5-trillion-token training corpus. The model supports context lengths of up to 131,072 tokens (this deployment serves a 32k context window) and is designed for real-world code-agent applications while retaining strong general and mathematical competencies.
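Listings in this format usually come with a usage snippet; none is shown here, so below is a minimal sketch of querying the model through an OpenAI-compatible chat-completions endpoint. The base URL and API key are placeholders (not taken from this page), and only the request payload is built, using the standard library; the actual HTTP call is shown as a comment so the sketch stays self-contained.

```python
import json

MODEL_ID = "AlienKevin/swe-smith-rs-base-qwen2.5-coder-32b-instruct-teacher-glm-4.6"

def build_chat_request(prompt: str, max_tokens: int = 512) -> str:
    """Build an OpenAI-compatible chat-completions payload for this model."""
    payload = {
        "model": MODEL_ID,
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.2,  # low temperature suits deterministic code output
    }
    return json.dumps(payload)

body = build_chat_request("Write a Rust function that reverses a string.")
# POST `body` to <base_url>/v1/chat/completions with your API key, e.g.:
# requests.post(f"{base_url}/v1/chat/completions",
#               headers={"Authorization": f"Bearer {api_key}"}, data=body)
print(json.loads(body)["model"])
```

Note that prompts near the serving limit should budget for `max_tokens`: the prompt plus the completion must fit inside the 32k context window of this deployment.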