hector-gr/RLCR-v4-ks-highcov-accgated-cold-math
Text generation · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Mar 28, 2026 · Architecture: Transformer

hector-gr/RLCR-v4-ks-highcov-accgated-cold-math is a 7.6-billion-parameter language model fine-tuned from Qwen/Qwen2.5-7B. Developed by hector-gr, it was trained with GRPO (Group Relative Policy Optimization), the reinforcement-learning method introduced in the DeepSeekMath paper, to strengthen mathematical reasoning. The model targets tasks that require advanced mathematical problem solving and logical deduction, and supports a 32,768-token context length.
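For context on the training method: the distinguishing idea of GRPO, per the DeepSeekMath paper, is to replace a learned value function with a group-relative baseline. A minimal sketch of that advantage computation follows; the notation (rewards r_i over a group of G sampled completions) is illustrative and not taken from this model card.

```latex
% GRPO group-relative advantage (DeepSeekMath): sample G completions
% per prompt, score each with a scalar reward r_i, and normalize each
% reward against the statistics of its own group.
\[
  \hat{A}_i \;=\; \frac{r_i - \operatorname{mean}(r_1, \dots, r_G)}
                       {\operatorname{std}(r_1, \dots, r_G)},
  \qquad i = 1, \dots, G
\]
```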
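A minimal usage sketch with Hugging Face transformers, assuming the checkpoint is hosted under the repo id above and follows the standard Qwen2.5 causal-LM layout; the prompt and generation settings are illustrative, not part of this model card.

```python
# Minimal loading/inference sketch (assumes a standard Qwen2.5-style
# causal LM under this repo id; prompt and settings are illustrative).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hector-gr/RLCR-v4-ks-highcov-accgated-cold-math"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # defer to the dtype stored in the checkpoint
    device_map="auto",
)

# Build a chat-formatted math prompt using the tokenizer's template.
messages = [
    {"role": "user", "content": "Prove that the sum of two odd integers is even."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```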
