MVPRM/Qwen3-0.6B-Base-CPT-Math
Text Generation
- Concurrency Cost: 1
- Model Size: 0.8B
- Quant: BF16
- Ctx Length: 32k
- Published: Mar 29, 2026
- Architecture: Transformer
MVPRM/Qwen3-0.6B-Base-CPT-Math is a 0.8 billion parameter language model from the Qwen3 family, published by MVPRM. As its name indicates, it is a base (non-instruct) model, apparently produced by continued pretraining (CPT) of Qwen3-0.6B-Base on math-oriented data. It targets general language understanding and generation tasks, and its compact size suits deployments with limited compute. The model is intended as a foundation for further fine-tuning on specific downstream applications.
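A minimal sketch of loading the model for text generation, assuming the checkpoint is published on the Hugging Face Hub under this repo id and is loadable with the `transformers` library (the repo id and loading path are assumptions, not confirmed by this card):

```python
# Sketch: generate text with MVPRM/Qwen3-0.6B-Base-CPT-Math via transformers.
# Assumes the checkpoint is hosted on the Hugging Face Hub under this id.

MODEL_ID = "MVPRM/Qwen3-0.6B-Base-CPT-Math"
MAX_CONTEXT = 32_768  # 32k context length, per the listing above


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Return a greedy-decoded completion of `prompt` from the base model."""
    # Imported lazily so the sketch can be read/imported without the
    # heavyweight torch/transformers dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because this is a base model rather than an instruct model, prompts should be phrased as text to be continued (e.g. the start of a solution) rather than as chat-style instructions.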