prithivMLmods/rStar-Coder-Qwen3-0.6B
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quant: BF16 · Context length: 32k · Published: Aug 5, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

prithivMLmods/rStar-Coder-Qwen3-0.6B is a 0.8-billion-parameter model fine-tuned from Qwen3-0.6B on the rStar-Coder dataset, augmented with code expert clusters and an extended open code reasoning dataset. It supports a 40,960-token context length and targets unified reasoning across code, mathematics, and scientific logic. The model is optimized for advanced code generation, scientific problem-solving, and structured output in formats such as LaTeX and JSON, making it well suited to developers, educators, and researchers who need high symbolic fidelity on mid-range GPUs and edge AI systems.
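A minimal sketch of running the model locally with the Hugging Face `transformers` library, assuming `transformers` and `torch` are installed; the `generate` helper and its parameters are illustrative, not part of this card:

```python
# Hypothetical helper for single-turn code-generation prompts.
# The model ID comes from this card; everything else is an assumption.
MODEL_ID = "prithivMLmods/rStar-Coder-Qwen3-0.6B"

def generate(prompt: str, max_new_tokens: int = 512) -> str:
    """Return the model's completion for a single user message."""
    # Imports are kept inside the function so the module loads
    # even where transformers/torch are not installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # picks BF16 where supported
        device_map="auto",    # places layers on GPU if available
    )

    # Qwen3-style chat formatting via the tokenizer's chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

# Example (downloads the model weights on first call):
# print(generate("Write a Python function that checks if a number is prime."))
```

Note that the first call to `generate` downloads the model weights from the Hub; the 0.8B BF16 footprint is what makes the card's claim about mid-range GPUs and edge systems plausible.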
