rikunarita/Qwen3-4B-Thinking-2507-Genius-Coder
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Feb 9, 2026 · License: MIT · Architecture: Transformer · Open Weights

rikunarita/Qwen3-4B-Thinking-2507-Genius-Coder is a 4.0 billion parameter causal language model, fine-tuned by rikunarita from Qwen3-4B-Thinking-2507 using the TeichAI/gpt-5.1-codex-max-1000x dataset. This model is specifically enhanced for complex reasoning tasks, coding, and agentic use, featuring a native context length of 262,144 tokens. It demonstrates significantly improved performance across logical reasoning, mathematics, science, and coding benchmarks, making it suitable for applications requiring deep analytical capabilities.
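Since this is a "Thinking" checkpoint, generations close the reasoning trace with a `</think>` tag before the final answer, and downstream code usually splits on that marker. A minimal sketch of that post-processing step, assuming the standard Qwen3 thinking-output convention (the helper name and sample string below are illustrative, not from this model card):

```python
def split_thinking_output(text: str) -> tuple[str, str]:
    """Split a Qwen3-Thinking generation into (reasoning, answer).

    Thinking checkpoints end their reasoning with a `</think>` tag;
    everything after it is the user-facing answer. If the tag is
    absent, treat the entire text as the answer.
    """
    marker = "</think>"
    if marker in text:
        reasoning, _, answer = text.partition(marker)
        return reasoning.strip(), answer.strip()
    return "", text.strip()


# Illustrative generation string (not real model output):
sample = "First check the base case.</think>The function returns the sum."
reasoning, answer = split_thinking_output(sample)
print(answer)  # → The function returns the sum.
```

In practice the same split is applied to the decoded text returned by the serving endpoint or `generate()` call before displaying or logging the answer.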
