taki555/Qwen3-4B-Thinking-2507-Art
Text generation · Open weights
Model size: 4B · Quant: BF16 · Context length: 32k · Concurrency cost: 1
Published: Mar 4, 2026 · License: apache-2.0 · Architecture: Transformer

taki555/Qwen3-4B-Thinking-2507-Art is a 4-billion-parameter Qwen3-based causal language model derived from Qwen3-4B-Thinking-2507. It is optimized for efficient Chain-of-Thought (CoT) reasoning, aiming to produce accurate thinking trajectories with reduced computational overhead. The model maintains strong performance across varying thinking-token budgets, making it suitable for reasoning tasks where efficiency is critical.
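Since this is a thinking model, its generations typically contain a reasoning trace followed by the final answer. A minimal sketch of post-processing such output, assuming the model follows the usual Qwen3-Thinking convention of closing the CoT with a `</think>` tag (the opening `<think>` tag may or may not appear in the decoded text, depending on the chat template):

```python
def split_thinking(text: str, close_tag: str = "</think>") -> tuple[str, str]:
    """Split generated text into (reasoning, answer).

    Assumes the Qwen3-Thinking convention: the CoT ends at the last
    close_tag; everything after it is the final answer. If no close
    tag is present, the whole text is treated as the answer.
    """
    idx = text.rfind(close_tag)
    if idx == -1:
        return "", text.strip()
    # Strip a leading <think> tag if the template emitted one.
    reasoning = text[:idx].replace("<think>", "", 1).strip()
    answer = text[idx + len(close_tag):].strip()
    return reasoning, answer


# Example with a synthetic generation:
reasoning, answer = split_thinking(
    "<think>2 + 2 is 4.</think>The answer is 4."
)
```

This keeps the efficiency-oriented reasoning trace available for inspection or logging while exposing only the final answer to downstream consumers.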
