ozayezerceli/Qwen3-4B-Inst-CoTsft
Text generation · Open weights
- Concurrency cost: 1
- Model size: 4B
- Quantization: BF16
- Context length: 32k
- Published: Dec 22, 2025
- License: apache-2.0
- Architecture: Transformer

ozayezerceli/Qwen3-4B-Inst-CoTsft is a 4-billion-parameter instruction-tuned causal language model published by ozayezerceli. It is a fine-tuned version of unsloth/Qwen3-4B-Instruct-2507, trained efficiently with Unsloth and Hugging Face's TRL library. The model supports a 40,960-token context length, making it suitable for tasks that require processing long inputs.
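Since this is a standard causal language model hosted under a Hugging Face-style repo id, it can presumably be loaded with the `transformers` library. The sketch below is a minimal, hedged example assuming the repo id from this card and a chat-style tokenizer template; the prompt text and generation settings are illustrative, not taken from the model card.

```python
# Hedged sketch: loading and prompting the model with Hugging Face transformers.
# Assumes `transformers` and `torch` are installed and that the repo uses a
# chat template (typical for Qwen3 instruct models); nothing here is an
# official usage snippet from the model card.
MODEL_ID = "ozayezerceli/Qwen3-4B-Inst-CoTsft"  # repo id from the card


def build_chat(user_message: str) -> list[dict]:
    """Build a minimal chat-format message list for apply_chat_template."""
    return [{"role": "user", "content": user_message}]


if __name__ == "__main__":
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card; device_map="auto"
    # places the 4B weights on GPU if one is available.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )

    prompt = tokenizer.apply_chat_template(
        build_chat("Explain chain-of-thought prompting in two sentences."),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:],
                           skip_special_tokens=True))
```

For production serving, the same repo id would typically be passed to an inference engine such as vLLM instead of raw `generate` calls.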
