dnotitia/Qwen3-4B-Thinking-2507
Text Generation | Open Weights | Warm
Concurrency Cost: 1 | Model Size: 4B | Quant: BF16 | Ctx Length: 32k
Published: Jan 26, 2026 | License: apache-2.0 | Architecture: Transformer

Qwen3-4B-Thinking-2507 is a 4.0-billion-parameter causal language model developed by Qwen, specifically enhanced for complex reasoning tasks. The model has a native context length of 262,144 tokens and significantly improved performance across logical reasoning, mathematics, science, coding, and academic benchmarks. It operates in "thinking mode," emitting an explicit reasoning trace before its final answer, and excels in scenarios requiring deep analytical processing and long-context understanding.
