GAIR/OpenSWE-32B
Text generation · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: Mar 15, 2026 · License: Qwen · Architecture: Transformer · Concurrency cost: 2

GAIR/OpenSWE-32B is a 32.8-billion-parameter model developed by GAIR and fine-tuned specifically for software engineering tasks. It is trained on the OpenSWE dataset, which comprises 45,320 executable Docker environments drawn from 12.8k repositories, and targets automated code generation and bug fixing. The model scores 62.4% on SWE-bench Verified, a new state of the art among SFT-based methods built on the Qwen2.5 series.
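Below is a minimal usage sketch, assuming the checkpoint is published on Hugging Face under the same `GAIR/OpenSWE-32B` identifier and exposes the standard Qwen2.5 chat template (both assumptions, not confirmed by this card); the prompt is a hypothetical bug-fixing example.

```python
# Minimal usage sketch (assumptions: checkpoint is on Hugging Face as
# "GAIR/OpenSWE-32B" and ships a Qwen2.5-style chat template).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GAIR/OpenSWE-32B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint config
    device_map="auto",    # shard the 32.8B parameters across available GPUs
)

# Hypothetical bug-fixing prompt for illustration.
messages = [
    {"role": "system", "content": "You are a software engineering assistant."},
    {"role": "user", "content": (
        "Fix the off-by-one error in this function:\n\n"
        "def last_item(xs):\n    return xs[len(xs)]"
    )},
]

inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```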
