OsakanaTeishoku/Qwen3-4B-Thinking-2507-reasoning-ja-20260329
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Mar 29, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
OsakanaTeishoku/Qwen3-4B-Thinking-2507-reasoning-ja-20260329 is a 4-billion-parameter Qwen3-based causal language model developed by OsakanaTeishoku. Fine-tuned on the DataPilot/Knowledge-QA-SingleTurn-Dataset, it is designed to generate reasoning-style responses in Japanese to Japanese inputs. The model card states a context length of 16384 tokens and targets knowledge-based question answering in Japanese.