taketakedaiki/qwen3-4b-v2-exp28
Text Generation · Open Weights

- Model Size: 4B
- Quantization: BF16
- Context Length: 32k
- Concurrency Cost: 1
- Published: Mar 1, 2026
- License: apache-2.0
- Architecture: Transformer

taketakedaiki/qwen3-4b-v2-exp28 is a 4-billion-parameter Qwen3 model fine-tuned by taketakedaiki. It was trained with Unsloth and Hugging Face's TRL library, a combination that enables roughly 2x faster fine-tuning. The model is intended for general-purpose language tasks.
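The card does not include a usage snippet. Below is a minimal sketch of running inference with the standard Hugging Face `transformers` API; the generation settings and prompt are illustrative assumptions, not documented defaults for this checkpoint:

```python
MODEL_ID = "taketakedaiki/qwen3-4b-v2-exp28"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Run a single text-generation call against the checkpoint."""
    # Imports are deferred so the sketch can be read without torch
    # or transformers installed; loading the 4B model itself needs
    # enough GPU/CPU memory for BF16 weights.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # the card lists BF16 weights
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and keep only the newly generated text.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize what a 32k context window allows."))
```

For chat-style use, `tokenizer.apply_chat_template` would normally be applied to the messages first; whether this fine-tune expects a particular chat template is not stated on the card.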
