anujjamwal/hcot-qwen2.5-math-1.5b
Text Generation · Open Weights
Model Size: 1.5B | Quant: BF16 | Context Length: 32k | Concurrency Cost: 1
Published: Feb 23, 2026 | License: apache-2.0 | Architecture: Transformer

anujjamwal/hcot-qwen2.5-math-1.5b is a 1.5-billion-parameter language model fine-tuned by anujjamwal from the Qwen2.5-Math-1.5B base model. It is specifically optimized for mathematical reasoning, and its 32,768-token context window lets it handle long, multi-step mathematical problems and related computational queries. Its main strength is this specialized fine-tuning for the mathematical domain.
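A minimal usage sketch follows, assuming the checkpoint loads through the standard Hugging Face transformers API like its Qwen2.5 base model (this has not been verified against the actual repository; the prompt wording and generation settings are illustrative, not part of the model card):

```python
"""Sketch: querying hcot-qwen2.5-math-1.5b via transformers (assumed API)."""

MODEL_ID = "anujjamwal/hcot-qwen2.5-math-1.5b"


def build_math_prompt(problem: str) -> str:
    """Wrap a math problem in a simple step-by-step instruction prompt.

    The exact prompt format expected by this fine-tune is an assumption;
    check the model repository for the intended template.
    """
    return (
        "Solve the following problem step by step.\n\n"
        f"Problem: {problem}\nSolution:"
    )


def generate_solution(problem: str, max_new_tokens: int = 512) -> str:
    """Load the model (downloads weights on first call) and generate an answer."""
    # Imported lazily so the prompt helper above works without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization of the weights.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    inputs = tokenizer(build_math_prompt(problem), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_solution("What is the sum of the first 100 positive integers?"))
```

Note that the full 32k context is available for long problem statements, but generation length is controlled separately via `max_new_tokens`.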
