kairawal/Qwen3-4B-HI-SynthDolly-1A-E5
Text Generation
Concurrency Cost: 1
Model Size: 4B
Quant: BF16
Ctx Length: 32k
Published: Apr 6, 2026
License: apache-2.0
Architecture: Transformer
Open Weights · Cold
kairawal/Qwen3-4B-HI-SynthDolly-1A-E5 is a 4-billion-parameter Qwen3 model developed by kairawal, fine-tuned using Unsloth and Hugging Face's TRL library. These tools enable faster, more memory-efficient fine-tuning. With a 32,768-token context length, it is suited to applications that require substantial input processing. Its main differentiator is this efficient training pipeline, which makes it a resource-friendly option for general language generation tasks.
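A minimal usage sketch with the Hugging Face Transformers library, assuming the model follows the standard Qwen3 chat template and loads via `AutoModelForCausalLM`; the generation settings are illustrative defaults, not values published for this model:

```python
MODEL_ID = "kairawal/Qwen3-4B-HI-SynthDolly-1A-E5"  # repo id from this card


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format expected by
    tokenizer.apply_chat_template()."""
    return [{"role": "user", "content": user_prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a completion for a single user prompt."""
    # Imported here so the lightweight helper above works without
    # transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on this card.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )
    input_ids = tokenizer.apply_chat_template(
        build_messages(prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens before decoding the reply.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Summarize the Qwen3 architecture in one sentence."))
```

At BF16, a 4B model needs roughly 8 GB of accelerator memory for the weights alone, plus KV-cache that grows with context length, so the full 32k window may require more.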