alibaba-pai/DistillQwen-ThoughtY-32B
Text Generation · Concurrency Cost: 2 · Model Size: 32B · Quantization: FP8 · Context Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights

alibaba-pai/DistillQwen-ThoughtY-32B is a 32-billion-parameter causal language model from the DistillQwen-ThoughtY series, developed by Alibaba-PAI. It is specifically optimized for enhanced Chain-of-Thought (CoT) reasoning, outperforming previous versions of the series and Qwen3 on complex mathematical, scientific, and coding tasks. The model is trained on OmniThought-0528, a dataset of 365K high-quality CoT examples, to achieve state-of-the-art performance in reasoning-intensive applications. It is designed for use cases requiring robust step-by-step problem-solving.
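Because the model's output centers on an explicit reasoning trace, downstream code typically needs to separate the chain-of-thought from the final answer. The sketch below assumes the Qwen-style convention of wrapping reasoning in `<think>…</think>` tags; the actual output format should be verified against the model's documentation.

```python
import re

def split_cot(response: str) -> tuple[str, str]:
    """Split a model response into (reasoning trace, final answer).

    Assumes Qwen-style <think>...</think> delimiters around the
    chain-of-thought; this is an assumption, not a confirmed spec.
    """
    match = re.search(r"<think>(.*?)</think>", response, flags=re.DOTALL)
    if match is None:
        # No reasoning block found: treat the whole response as the answer.
        return "", response.strip()
    reasoning = match.group(1).strip()
    answer = response[match.end():].strip()
    return reasoning, answer

# Example on a hypothetical model response:
reasoning, answer = split_cot("<think>2 + 2 equals 4.</think>The answer is 4.")
```

Keeping this parsing step separate from generation makes it easy to log or discard the reasoning trace independently of the answer shown to users.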
