Ligeng-Zhu/Qwen2.5-Math-7B-32k
Task: Text generation
Concurrency cost: 1
Model size: 7.6B
Quantization: FP8
Context length: 32k
Published: Mar 12, 2026
License: apache-2.0
Architecture: Transformer
Tags: Open Weights, Cold

Qwen2.5-Math-7B-32k is a 7.6 billion parameter mathematical large language model from the Qwen family, extended to a 32k context length. It is designed for solving mathematical problems in both English and Chinese, using Chain-of-Thought (CoT) and Tool-Integrated Reasoning (TIR) to improve accuracy. The model targets complex mathematical and algorithmic reasoning tasks where precise, step-by-step computation matters.
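As a minimal sketch of CoT usage, the snippet below builds a chat-message list for the model. The system prompt follows the convention published for the upstream Qwen2.5-Math models ("put your final answer within \boxed{}"); whether this fine-tune expects exactly the same prompt is an assumption. The resulting list is the shape accepted by `tokenizer.apply_chat_template` in Hugging Face `transformers`.

```python
# Build a Chain-of-Thought prompt for a Qwen2.5-Math-style model.
# MODEL_ID comes from this card; the system prompt follows the upstream
# Qwen2.5-Math CoT convention (assumed to carry over to this fine-tune).

MODEL_ID = "Ligeng-Zhu/Qwen2.5-Math-7B-32k"

COT_SYSTEM_PROMPT = (
    "Please reason step by step, and put your final answer within \\boxed{}."
)

def build_cot_messages(question: str) -> list[dict]:
    """Return chat messages in the format expected by
    tokenizer.apply_chat_template for Qwen-family models."""
    return [
        {"role": "system", "content": COT_SYSTEM_PROMPT},
        {"role": "user", "content": question},
    ]

if __name__ == "__main__":
    msgs = build_cot_messages("Find x such that 2x + 3 = 11.")
    # msgs would next be passed to tokenizer.apply_chat_template(...)
    # and then to model.generate(...) after loading MODEL_ID.
    print(msgs)
```

For TIR mode, the upstream Qwen2.5-Math cards use a different system prompt that instructs the model to integrate step-by-step reasoning with Python tool calls; the same message structure applies.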
