Skywork/Skywork-OR1-7B-Preview
Text generation · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Apr 13, 2025 · Architecture: Transformer

Skywork-OR1-7B-Preview is a 7.6 billion parameter general-purpose reasoning model developed by Skywork as part of the Skywork-OR1 (Open Reasoner 1) series. The model is optimized for mathematical and coding reasoning tasks and outperforms similarly sized models in both domains. It was trained with large-scale rule-based reinforcement learning on carefully curated datasets, using a customized GRPO (Group Relative Policy Optimization) approach.
