kumapo/qwen3-0.6b-sft-lora-rank2048-2phase
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quant: BF16 · Context length: 32k · Published: Oct 3, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

kumapo/qwen3-0.6b-sft-lora-rank2048-2phase is a 0.8 billion parameter Qwen3-based model published by kumapo, fine-tuned with Unsloth and Hugging Face's TRL library. As the name indicates, it was produced by supervised fine-tuning (SFT) with a LoRA adapter of rank 2048, applied in two phases. Unsloth's kernel optimizations make this finetuning roughly 2x faster than a standard setup. The model is intended as an efficient, capable base for general language tasks.
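A minimal sketch of loading the model for text generation with the Hugging Face `transformers` library. This is an assumption about typical usage, not an official snippet from the card: the helper names (`build_chat_prompt`, `generate`) are illustrative, downloading the weights requires network access, and `bfloat16` mirrors the BF16 quantization listed above.

```python
MODEL_ID = "kumapo/qwen3-0.6b-sft-lora-rank2048-2phase"


def build_chat_prompt(user_message: str) -> list:
    """Wrap a user message in the chat-message format that
    tokenizer.apply_chat_template expects."""
    return [{"role": "user", "content": user_message}]


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Download the model from the Hub and generate a reply (needs network + transformers + torch)."""
    # Imported here so build_chat_prompt stays usable without these packages installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    # Render the chat messages into token ids, appending the assistant turn marker.
    input_ids = tokenizer.apply_chat_template(
        build_chat_prompt(prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize what LoRA fine-tuning does."))
```

Keeping the model in BF16 matches the listed quantization; for CPU-only inference one might instead pass `torch_dtype=torch.float32`.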
