kairawal/Gemma-3-4B-IT-PT-SynthDolly-1A-E5
Vision: supported
Concurrency Cost: 1
Model Size: 4.3B
Quant: BF16
Ctx Length: 32k
Published: Apr 7, 2026
License: apache-2.0
Architecture: Transformer (open weights)
kairawal/Gemma-3-4B-IT-PT-SynthDolly-1A-E5 is a 4.3-billion-parameter language model developed by kairawal, fine-tuned from unsloth/gemma-3-4b-it. It was trained with Unsloth and Hugging Face's TRL library, which is reported to give 2x faster training. The model supports a 32,768-token context length, making it suitable for tasks that require processing long inputs.
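A minimal usage sketch, assuming the checkpoint loads through the standard transformers API like its unsloth/gemma-3-4b-it base (the prompt text and generation settings below are illustrative, not from the model card):

```python
MODEL_ID = "kairawal/Gemma-3-4B-IT-PT-SynthDolly-1A-E5"

if __name__ == "__main__":
    # Requires: pip install transformers accelerate torch
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization of the weights.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )

    # Gemma-3 instruction-tuned checkpoints use a chat template,
    # so format the request as a message list.
    messages = [{"role": "user", "content": "Explain what fine-tuning is in one sentence."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The 32k context window means long documents can be placed directly in the user message, though generation cost grows with prompt length.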