Skywork/Skywork-SWE-32B
Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Jun 11, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Skywork-SWE-32B is a 32.8-billion-parameter code-agent model developed by Skywork AI, designed specifically for software engineering (SWE) tasks. Built on the Qwen2.5-Coder-32B-Instruct backbone, it achieves 38.0% pass@1 accuracy on the SWE-bench Verified benchmark, outperforming previous open-source models in its class. With a context length of up to 131,072 tokens, the model excels at automated code generation and bug fixing within executable runtime environments. Accuracy improves further, to 47.0%, with test-time scaling techniques, making the model suitable for complex software development workflows.
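As a minimal sketch, the model can be driven through any OpenAI-compatible chat-completions endpoint. The base URL, API key placeholder, and exact request shape below are assumptions; consult your provider's documentation for the real values.

```python
# Minimal sketch of calling Skywork-SWE-32B through an OpenAI-compatible
# chat-completions endpoint. The endpoint URL and API key are placeholders,
# not confirmed values -- check your provider's docs.
import json
import urllib.request

def build_request(prompt: str,
                  model: str = "Skywork/Skywork-SWE-32B",
                  max_tokens: int = 512) -> urllib.request.Request:
    """Build (but do not send) a chat-completions request for the model."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a software engineering assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
    }
    return urllib.request.Request(
        "https://api.featherless.ai/v1/chat/completions",  # assumed endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer YOUR_API_KEY",  # placeholder key
        },
        method="POST",
    )

req = build_request("Fix the off-by-one error: for i in range(len(xs) + 1): print(xs[i])")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` returns the usual chat-completions JSON; the sketch stops at building the request so it runs without credentials.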


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model. Each config tunes the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
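These sampler parameters map directly onto fields of an OpenAI-style completion request. The sketch below shows that mapping; the numeric values are illustrative placeholders, not the actual popular Featherless configs.

```python
# Illustrative sampler configuration for an OpenAI-compatible request body.
# Every numeric value here is a placeholder for demonstration, NOT one of
# the popular user configs referenced above.
sampler_settings = {
    "temperature": 0.7,         # randomness of token sampling
    "top_p": 0.9,               # nucleus sampling: keep tokens covering 90% mass
    "top_k": 40,                # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they have appeared
    "presence_penalty": 0.0,    # penalize tokens that have appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
}

# Merge the sampler knobs into a completion request body.
request_body = {
    "model": "Skywork/Skywork-SWE-32B",
    "prompt": "def fizzbuzz(n):",
    **sampler_settings,
}
print(sorted(request_body))
```

Note that `repetition_penalty` and `min_p` are extensions common to open-model servers (e.g. vLLM-style APIs) rather than part of the original OpenAI parameter set, so support varies by provider.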