zwhe99/DeepMath-Zero-Math-7B
Text Generation | Model Size: 7.6B | Quant: FP8 | Context Length: 32k | Published: May 22, 2025 | License: MIT | Architecture: Transformer | Concurrency Cost: 1 | Open Weights

DeepMath-Zero-Math-7B is a 7.6 billion parameter language model developed by zwhe99, fine-tuned from Qwen2.5-Math-7B using reinforcement learning on the DeepMath-103K dataset. This model is specifically optimized for advanced mathematical reasoning, excelling at challenging problems across algebra, calculus, number theory, geometry, probability, and discrete mathematics. It features a 32768-token context length and achieves state-of-the-art results on demanding math benchmarks.
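Below is a minimal sketch of running the model locally with Hugging Face transformers, assuming the weights are downloadable from the Hub under the repo id listed above. The dtype, device settings, example problem, and generation parameters are illustrative choices, not requirements of the model; the chat template is assumed to be inherited from the Qwen2.5-Math base.

```python
# Minimal sketch: local inference with Hugging Face transformers.
# Assumes weights are available under the listed repo id; settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zwhe99/DeepMath-Zero-Math-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit on a single large GPU
    device_map="auto",
)

# Pose a math problem; the chat template is assumed to come from Qwen2.5-Math.
messages = [{"role": "user", "content": "Find all real x such that x^2 - 5x + 6 = 0."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# The 32k context window leaves room for long multi-step reasoning traces.
outputs = model.generate(input_ids, max_new_tokens=2048, do_sample=False)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Greedy decoding (`do_sample=False`) is shown here for reproducible math outputs; sampling parameters can be swapped in for more varied solutions.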
