RoadQAQ/Qwen2.5-Math-7B-16k-think
Text generation · Open weights
Model size: 7.6B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1
Published: Jun 8, 2025 · License: MIT · Architecture: Transformer

RoadQAQ/Qwen2.5-Math-7B-16k-think is a 7.6-billion-parameter language model based on the Qwen2.5-Math-7B architecture, developed by RoadQAQ. It extends the context window to 16k tokens and modifies the rope_theta value for improved long-context understanding. The model is fine-tuned specifically for mathematical reasoning and problem solving, incorporating a special token and a custom chat template to enhance its handling of complex mathematical queries.
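To illustrate why raising rope_theta helps with longer contexts: RoPE rotates each query/key dimension pair at a frequency derived from the base, and a larger base stretches the longest rotation wavelength so distant positions remain distinguishable. The sketch below is illustrative only; 10000 is the common RoPE default, and the larger base shown is a stand-in, not this model's actual rope_theta value.

```python
import math

def rope_wavelengths(base: float, head_dim: int = 128) -> list[float]:
    # RoPE rotates dimension pair i at frequency base^(-2i/d), so the
    # positional wavelength of pair i is 2*pi * base^(2i/d).
    return [2 * math.pi * base ** (2 * i / head_dim) for i in range(0, head_dim, 2)]

# Illustrative bases: 10000 is the usual RoPE default; 1e6 stands in
# for a raised rope_theta (the model card does not state the exact value).
default_max = rope_wavelengths(10_000.0)[-1]
raised_max = rope_wavelengths(1_000_000.0)[-1]
print(f"longest wavelength at base 1e4: {default_max:,.0f} positions")
print(f"longest wavelength at base 1e6: {raised_max:,.0f} positions")
```

Every wavelength (past the first, frequency-1 pair) grows with the base, which is why extended-context fine-tunes typically pair a longer training context with a larger rope_theta.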
