luisfsalazar/Qwen3-0.6B-Base-CPT-Math
Text Generation · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · Published: Mar 16, 2026 · Architecture: Transformer · Warm
luisfsalazar/Qwen3-0.6B-Base-CPT-Math is a 0.8 billion parameter language model based on the Qwen3 architecture. It is designed and optimized for mathematical reasoning and computational tasks, and its 32768-token context length makes it suitable for complex, multi-step problem solving and detailed mathematical inquiries. Its primary strength is handling mathematical operations and logic, which distinguishes it from general-purpose LLMs.
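A minimal usage sketch, assuming the checkpoint loads through the standard Hugging Face `transformers` causal-LM API used by Qwen3-family models; the prompt, dtype choice, and generation settings below are illustrative assumptions, not part of this model card:

```python
# Hypothetical usage sketch (assumed transformers API for a Qwen3-family
# causal LM). Downloads the checkpoint on first call; prompt and settings
# are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "luisfsalazar/Qwen3-0.6B-Base-CPT-Math"


def solve(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for a math prompt with this base model."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(solve("Compute the derivative of x^3 + 2x."))
```

Since this is a base (non-instruct) checkpoint, plain-text continuation prompts like the one above are likely to work better than chat-formatted inputs.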