LogicStar/SWE-Star-32B
Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Jan 17, 2026 · License: MIT · Architecture: Transformer · Open Weights

LogicStar's SWE-Star-32B is a 32.8-billion-parameter language model based on the Qwen2.5-Coder architecture and fine-tuned specifically for agentic coding tasks. It was trained on 250k agentic coding trajectories distilled from Devstral-2-Small using SWE-Smith tasks. The model targets automated software engineering and shows substantially improved performance on the Python-based SWE-Bench Verified benchmark.
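
Below is a minimal usage sketch, assuming the weights are published on the Hugging Face Hub under the `LogicStar/SWE-Star-32B` ID and expose a Qwen2.5-style chat template; the prompt and generation settings are illustrative only, not an official recipe.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID, taken from the listing above.
model_id = "LogicStar/SWE-Star-32B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the dtype recorded in the checkpoint config
    device_map="auto",   # shard the 32.8B model across available GPUs
)

# An agentic-coding style prompt (hypothetical example task).
messages = [
    {"role": "user", "content": "The test suite in my repo fails with an "
                                "ImportError. Propose a minimal patch."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:],
                       skip_special_tokens=True))
```

In practice, a model like this would typically be driven by an agent scaffold that iterates over tool calls and repository state rather than by single-turn prompting as shown here.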
