sail/Qwen2.5-Math-7B-Oat-Zero
Text Generation · Open Weights
- Concurrency Cost: 1
- Model Size: 7.6B
- Quantization: FP8
- Context Length: 32k
- Published: Mar 17, 2025
- License: apache-2.0
- Architecture: Transformer

sail/Qwen2.5-Math-7B-Oat-Zero is a 7.6-billion-parameter language model from sail, built on Qwen2.5-Math-7B. It was fine-tuned with the minimalist R1-Zero recipe using the Dr. GRPO algorithm on level 3-5 questions from the MATH dataset. The model is optimized for advanced mathematical reasoning and problem solving, and it performs strongly on widely used math benchmarks.
