sail/Qwen2.5-Math-1.5B-Oat-Zero
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Concurrency cost: 1 · Published: Mar 17, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights
sail/Qwen2.5-Math-1.5B-Oat-Zero is a 1.5-billion-parameter language model from Sail (Sea AI Lab), fine-tuned for mathematical reasoning. Built on the Qwen2.5-Math-1.5B base model, it follows the minimalist R1-Zero training recipe, applying the Dr. GRPO algorithm to level 3-5 questions from the MATH dataset. The model is optimized for solving competition-style math problems and is evaluated on widely used math benchmarks.
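As a minimal sketch of how the model might be used for inference, the snippet below loads it through the standard Hugging Face `transformers` causal-LM API. Only the model ID comes from this card; the prompt template, dtype, and generation settings are illustrative assumptions, not the authors' official configuration.

```python
MODEL_ID = "sail/Qwen2.5-Math-1.5B-Oat-Zero"


def build_prompt(question: str) -> str:
    # Plain question-then-answer prompt. The exact template used during
    # training may differ; treat this format as an assumption.
    return f"Question: {question}\nAnswer:"


def solve(question: str, max_new_tokens: int = 512) -> str:
    # transformers is imported lazily so the prompt helper above stays
    # dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on this card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(build_prompt(question), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Calling `solve("What is the remainder when 2^10 is divided by 7?")` would download the weights on first use and return the model's generated solution text.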