Ligeng-Zhu/Qwen2.5-Math-7B-32k

Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Mar 12, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Qwen2.5-Math-7B-32k is a 7.6 billion parameter mathematical large language model developed by Qwen, featuring a 32k context length. It is specifically designed for solving mathematical problems in both English and Chinese, utilizing Chain-of-Thought (CoT) and Tool-integrated Reasoning (TIR) for enhanced accuracy. This model is optimized for complex mathematical and algorithmic reasoning tasks, making it suitable for applications requiring precise computational capabilities.


Qwen2.5-Math-7B-32k: Specialized Mathematical LLM

Qwen2.5-Math-7B-32k is part of the Qwen2.5-Math series, the Qwen team's upgrade of the original Qwen2-Math family. This 7.6 billion parameter model is engineered specifically for mathematical problem-solving and supports both English and Chinese.

Key Capabilities

  • Enhanced Reasoning: Supports both Chain-of-Thought (CoT) and Tool-integrated Reasoning (TIR) to improve accuracy on complex mathematical problems (a CoT inference sketch follows this list).
  • Multilingual Math: Capable of solving mathematical problems in both English and Chinese.
  • Computational Precision: TIR significantly boosts the model's proficiency in precise computation, symbolic manipulation, and algorithmic reasoning, addressing the limitations of CoT alone (see the TIR sketch at the end of this page).
  • Performance: Achieves strong results on the MATH benchmark using TIR, with the 7B-Instruct variant scoring 85.3.
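
To make the CoT workflow concrete, here is a minimal inference sketch using Hugging Face transformers. It assumes the repo id above exposes the standard Qwen2 chat template and that you load the weights locally (shown in bf16; the hosted endpoint serves FP8). The system prompt follows the CoT convention recommended for the Qwen2.5-Math series; the sample question and generation settings are illustrative.

```python
# Minimal CoT inference sketch using Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ligeng-Zhu/Qwen2.5-Math-7B-32k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    # CoT system prompt used across the Qwen2.5-Math series.
    {
        "role": "system",
        "content": "Please reason step by step, and put your final answer within \\boxed{}.",
    },
    {"role": "user", "content": "Find the value of x that satisfies 2x + 3 = 11."},
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```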

Good For

  • Mathematical Problem Solving: Ideal for applications requiring robust solutions to math problems, from basic arithmetic to complex algorithmic tasks.
  • Research and Development: Serves as a strong base model for fine-tuning on specific mathematical domains or for further research into LLM reasoning capabilities.
  • Educational Tools: Can be integrated into systems designed to assist with mathematical learning and problem-solving.
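
For tool-integrated reasoning, the Qwen2.5-Math series uses a TIR-specific system prompt that asks the model to mix prose with Python programs; the caller then executes the emitted code. The sketch below is a simplified illustration: the extraction helper and unsandboxed subprocess call are assumptions for demonstration, not a production TIR pipeline, which would sandbox execution and feed results back to the model over multiple turns.

```python
# Simplified TIR sketch: ask the model to interleave reasoning with
# Python, then execute the first emitted code block and capture its
# stdout. Extraction and unsandboxed execution here are illustrative
# assumptions, not a hardened pipeline.
import re
import subprocess

# TIR system prompt used by the Qwen2.5-Math series (replaces the CoT
# system message in the example above).
TIR_SYSTEM = (
    "Please integrate natural language reasoning with programs to solve "
    "the problem above, and put your final answer within \\boxed{}."
)

def run_first_python_block(completion: str) -> str:
    """Execute the first fenced Python code block in a model completion
    and return its stdout."""
    match = re.search(r"`{3}python\n(.*?)`{3}", completion, re.DOTALL)
    if match is None:
        return ""
    result = subprocess.run(
        ["python", "-c", match.group(1)],
        capture_output=True, text=True, timeout=30,
    )
    return result.stdout.strip()
```

Pairing TIR_SYSTEM with the generate call from the CoT example yields a completion whose embedded program can be executed this way; the captured output is what grounds the model's final boxed answer.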