Skywork/Skywork-OR1-7B
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: May 13, 2025 · Architecture: Transformer

Skywork/Skywork-OR1-7B is a 7.6-billion-parameter Open Reasoner model developed by Skywork, designed for advanced math and code reasoning. Trained with large-scale rule-based reinforcement learning on curated datasets, it achieves competitive performance against similarly sized models in both mathematical problem-solving and coding scenarios. The model targets applications that require robust, consistent reasoning across complex quantitative and programming tasks, and its 131,072-token context length accommodates extensive problem descriptions and codebases.
