usersina/math-llm-sit-7b
Text generation · Open weights · Status: Cold
Concurrency cost: 1 | Model size: 7.6B | Quantization: FP8 | Context length: 32k
Published: Apr 2, 2026 | License: MIT | Architecture: Transformer
usersina/math-llm-sit-7b is a 7.6-billion-parameter language model fine-tuned for mathematical reasoning. Built on the Qwen2.5-7B-Instruct architecture, it is trained with a four-phase Specialized Intelligence Theory (SIT) pipeline. The model is optimized for solving complex mathematical problems, including integrals, and supports a 32,768-token (32k) context length.
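Since the model derives from Qwen2.5-7B-Instruct, single-turn math queries can be formatted with the ChatML-style chat template that family uses. This is a minimal sketch under that assumption; the SIT fine-tune may ship its own template, so verify against the model's `tokenizer_config.json` before relying on it. The helper name `build_math_prompt` is hypothetical.

```python
# Hypothetical helper: format a math question with the ChatML-style
# template used by the Qwen2.5 family (assumed unchanged by the SIT
# fine-tune; check the model's tokenizer_config.json to confirm).

def build_math_prompt(question: str,
                      system: str = "You are a helpful math assistant.") -> str:
    """Return a ChatML-formatted prompt for a single-turn math query."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{question}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_math_prompt("Evaluate the integral of x**2 from 0 to 3.")
print(prompt)
```

In practice this formatting is usually handled by `tokenizer.apply_chat_template` when loading the tokenizer through Hugging Face Transformers; the explicit string form above just makes the assumed structure visible.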