zTensor/Qwen2.5-Math-1.5B
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Apr 12, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

zTensor/Qwen2.5-Math-1.5B is a 1.5-billion-parameter mathematical language model developed by the Qwen team as part of the upgraded Qwen2.5-Math series. The model is specifically designed for solving math problems in English and Chinese, and supports both Chain-of-Thought (CoT) and Tool-Integrated Reasoning (TIR) prompting for enhanced computational accuracy. With a 32,768-token context length, it handles complex mathematical and algorithmic reasoning tasks.
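As a minimal sketch of how CoT prompting might be set up for this model: the snippet below hand-builds a single-turn prompt. The ChatML-style template and the "reason step by step" system prompt follow the conventions of the Qwen2.5-Math family, but neither is confirmed by this card; treat both as assumptions.

```python
# Sketch: constructing a Chain-of-Thought (CoT) prompt.
# Assumptions (not stated in this card): the model uses Qwen's
# ChatML-style chat template, and the CoT system prompt commonly
# recommended for the Qwen2.5-Math family.

COT_SYSTEM = "Please reason step by step, and put your final answer within \\boxed{}."

def build_cot_prompt(question: str) -> str:
    """Format a single-turn ChatML prompt that elicits step-by-step reasoning."""
    return (
        f"<|im_start|>system\n{COT_SYSTEM}<|im_end|>\n"
        f"<|im_start|>user\n{question}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_cot_prompt("Find the value of x such that 2x + 3 = 11.")
print(prompt)
```

In practice, you would typically pass a list of messages to the tokenizer's `apply_chat_template` method (as in the Hugging Face `transformers` library) rather than formatting the template by hand; the sketch only makes the prompt structure explicit.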
