allenai/SERA-32B
Text generation · Open weights · Model size: 32B · Quantization: FP8 · Context length: 32K · Concurrency cost: 2 · Published: Jan 27, 2026 · License: apache-2.0 · Architecture: Transformer

SERA-32B is a 32-billion-parameter open-source coding agent developed by the Allen Institute for AI (Ai2), built on the Qwen3-32B base model. It scores 49.5% on SWE-bench Verified, matching frontier open models and larger proprietary models, and was trained with Soft Verified Generation (SVG), a cost-efficient training method. The model is designed primarily for automated software-engineering tasks such as bug fixing, feature implementation, and refactoring, and supports a 32K-token context length.