vkasera/v2_qwen-2.5-1.5b-r1-countdown-phil
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Oct 5, 2025 · Architecture: Transformer

vkasera/v2_qwen-2.5-1.5b-r1-countdown-phil is a 1.5-billion-parameter language model fine-tuned from Qwen/Qwen2.5-1.5B-Instruct using GRPO (Group Relative Policy Optimization), a reinforcement-learning method designed to enhance mathematical reasoning. With a context length of 32,768 tokens, the model is suited to tasks that demand sustained logical and mathematical processing.
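To illustrate the core idea behind GRPO, the sketch below shows its group-relative advantage computation: several completions are sampled per prompt, and each completion's reward is normalized against the group's mean and standard deviation. This is a minimal illustration of the technique, not the model's actual training code; the function name and reward values are hypothetical.

```python
# Minimal sketch of GRPO's group-relative advantage step
# (illustrative only; not this model's training pipeline).
from statistics import mean, pstdev

def group_relative_advantages(rewards, eps=1e-8):
    """advantage_i = (r_i - mean(group)) / (std(group) + eps).

    Rewards are scored per sampled completion for one prompt;
    normalizing within the group removes the need for a value model.
    """
    mu = mean(rewards)
    sigma = pstdev(rewards)
    return [(r - mu) / (sigma + eps) for r in rewards]

# Example: four sampled answers to one math prompt, scored 0/1 for correctness.
advantages = group_relative_advantages([1.0, 0.0, 0.0, 1.0])
```

Completions scoring above the group mean receive positive advantages and are reinforced; those below the mean are penalized, which pushes the policy toward answers that outperform its own typical samples.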
