kairawal/Qwen3-0.6B-PT-SynthDolly-1A-E5
Text generation · Open weights
Concurrency cost: 1 · Model size: 0.8B · Quant: BF16 · Context length: 32k · Published: Apr 4, 2026 · License: apache-2.0 · Architecture: Transformer

kairawal/Qwen3-0.6B-PT-SynthDolly-1A-E5 is a 0.8 billion parameter Qwen3 model developed by kairawal, fine-tuned from unsloth/qwen3-0.6b. It was trained 2x faster using Unsloth together with Hugging Face's TRL library, offering efficient performance for its size. With a 32768-token context length, it is suited to tasks that involve longer input sequences. Its efficient training methodology makes it a practical choice for applications where rapid deployment and resource optimization matter.
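A minimal sketch of how one might load and query this model with the Hugging Face transformers library. The Dolly-style instruction framing in `build_prompt` is an assumption (the card names a SynthDolly fine-tune, so an instruction/response format is plausible but not confirmed); generation settings are illustrative, not an official recipe.

```python
MODEL_ID = "kairawal/Qwen3-0.6B-PT-SynthDolly-1A-E5"


def build_prompt(instruction: str) -> str:
    # Assumed Dolly-style instruction/response framing; check the model
    # card or tokenizer chat template for the actual expected format.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    # Deferred import so the prompt helper is usable without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card above.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("List three uses of a small language model."))
```

Running `generate` downloads the checkpoint on first use; for a 0.8B model in BF16 this fits comfortably on a single consumer GPU or CPU.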
