allenai/SERA-32B-GA
Text generation · Concurrency cost: 2 · Model size: 32B · Quant: FP8 · Context length: 32K · Published: Jan 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

SERA-32B-GA is a 32-billion-parameter open-source coding agent developed by the Allen Institute for AI (Ai2), built on the Qwen 3-32B base model. It is fine-tuned on synthetic agent trajectories and achieves 46.6% on SWE-bench Verified, making it a top-performing open-source model for automated software engineering tasks. The model is optimized for bug fixes, feature implementation, and refactoring within its 32K-token context window.
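As a sketch of how a coding-agent task might be sent to this model, the snippet below builds a chat-completion payload in the OpenAI-compatible format commonly used by serving stacks such as vLLM. This is an illustration only: the model card does not specify a serving setup, and the system prompt, parameters, and the assumption of an OpenAI-compatible endpoint are all hypothetical.

```python
import json

def build_request(task: str, max_tokens: int = 2048) -> dict:
    """Build a hypothetical chat-completion payload for a coding-agent task.

    Assumes SERA-32B-GA is served behind an OpenAI-compatible endpoint
    (e.g. vLLM); the system prompt and parameters are illustrative, not
    part of the model card.
    """
    return {
        "model": "allenai/SERA-32B-GA",
        "messages": [
            {
                "role": "system",
                "content": "You are a software engineering agent. "
                           "Propose a patch for the described issue.",
            },
            {"role": "user", "content": task},
        ],
        # Keep prompt plus completion well inside the 32K-token context window.
        "max_tokens": max_tokens,
        # Deterministic decoding is a common choice for patch generation.
        "temperature": 0.0,
    }

payload = build_request("Fix the off-by-one error in the pagination helper.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the server's `/v1/chat/completions` route by any OpenAI-compatible client.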
