kairawal/Qwen3-0.6B-EL-SynthDolly-1A-E5
Task: Text Generation
Concurrency Cost: 1
Model Size: 0.8B
Quant: BF16
Ctx Length: 32k
Published: Apr 5, 2026
License: apache-2.0
Architecture: Transformer
Open Weights · Cold
kairawal/Qwen3-0.6B-EL-SynthDolly-1A-E5 is a 0.8-billion-parameter Qwen3 model developed by kairawal and fine-tuned from unsloth/qwen3-0.6b. It was trained with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training. With a context length of 32,768 tokens, it is optimized for efficient language generation tasks.
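A minimal usage sketch with the standard Hugging Face Transformers text-generation API is shown below. This assumes the model is published on the Hugging Face Hub under the id above and loads in BF16 as the metadata indicates; the `generate` helper and the example prompt are illustrative, not part of the model card.

```python
# Usage sketch (assumption: the model is available on the Hugging Face Hub
# under this id and can be loaded with the standard Transformers API).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "kairawal/Qwen3-0.6B-EL-SynthDolly-1A-E5"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model (downloads weights on first call) and generate a completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed in the model metadata.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example call (requires network access to fetch the weights):
# print(generate("Explain what a context window is."))
```

Note that while the model supports a 32,768-token context, prompts near that limit will increase memory use and latency accordingly.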