Qwen3-32B is a 32.8-billion-parameter causal language model from the Qwen3 series, developed by the Qwen team. It supports seamless switching between a thinking mode, for complex reasoning tasks such as math and coding, and a non-thinking mode, for efficient general-purpose dialogue. With a native context length of 32,768 tokens, extendable to 131,072 tokens with YaRN, it offers strong reasoning, instruction following, agent capabilities, and multilingual support across more than 100 languages.
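As a minimal sketch of the YaRN context extension mentioned above, the extended window is the native 32,768-token length multiplied by a scaling factor of 4. The exact `rope_scaling` keys below follow the common Hugging Face convention and are an assumption here, not confirmed by this description:

```python
# Hedged sketch: a YaRN-style rope_scaling config (key names assumed,
# following the Hugging Face convention) for extending Qwen3-32B's
# native 32,768-token context to 131,072 tokens.
rope_scaling = {
    "rope_type": "yarn",                        # assumed key/value naming
    "factor": 4.0,                              # scaling factor
    "original_max_position_embeddings": 32768,  # native context length
}

# Extended context = native length x factor
extended_context = int(
    rope_scaling["factor"] * rope_scaling["original_max_position_embeddings"]
)
print(extended_context)  # 131072
```

This illustrates only the arithmetic behind the 131,072-token figure; consult the model's own documentation for the exact configuration it expects.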