bill-95/medgemma-4b-ft
Vision · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: Feb 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

bill-95/medgemma-4b-ft is a 4.3-billion-parameter Gemma 3 model fine-tuned by bill-95. It was trained with Unsloth and Hugging Face's TRL library, achieving a 2x speedup in the fine-tuning process. The model is optimized for efficient deployment and performance, making it suitable for applications that need a compact yet capable language model.
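For illustration, a minimal sketch of loading this checkpoint with Hugging Face `transformers` is shown below. This is an assumption, not an official usage snippet from the model author: the `image-text-to-text` pipeline task and BF16 setting are inferred from the Vision and BF16 badges above, and the exact task or model class for this checkpoint may differ.

```python
# Hypothetical usage sketch (not from the model card): load
# bill-95/medgemma-4b-ft via the transformers pipeline API.
# Requires the transformers and torch packages plus enough
# memory for a 4.3B-parameter BF16 model.

MODEL_ID = "bill-95/medgemma-4b-ft"  # repo id from this page


def load_pipeline():
    """Build a generation pipeline for the model.

    The import is deferred so that merely importing this module
    does not pull in transformers or trigger a multi-GB download.
    """
    from transformers import pipeline  # lazy import

    return pipeline(
        "image-text-to-text",   # assumed task, per the Vision badge
        model=MODEL_ID,
        torch_dtype="bfloat16",  # matches the BF16 quant listed above
    )


if __name__ == "__main__":
    pipe = load_pipeline()
    # pipe(...) can then be called with an image plus a text prompt.
```

The lazy import keeps the module importable on machines without `transformers` installed; the download and weight load only happen when `load_pipeline()` is actually called.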
