FritzStack/HiTOP-MedGemma4B-merged
- Modality: Vision
- Concurrency Cost: 1
- Model Size: 4.3B
- Quantization: BF16
- Context Length: 32k
- Published: Jan 23, 2026
- License: apache-2.0
- Architecture: Transformer (open weights)

FritzStack/HiTOP-MedGemma4B-merged is a 4.3 billion parameter Gemma3 model developed by FritzStack, fine-tuned from unsloth/medgemma-4b-pt. The model was trained with Unsloth and Hugging Face's TRL library, which sped up fine-tuning by roughly 2x. It is intended for applications that need efficient, accelerated fine-tuning of Gemma3-based language models.
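A minimal sketch of running this checkpoint for inference with the Hugging Face `transformers` library. Only the repo id comes from this card; the prompt, dtype, generation settings, and the text-generation loading path (as opposed to an image-text pipeline, which a vision-capable Gemma3 may also support) are illustrative assumptions, not documented defaults.

```python
"""Illustrative inference sketch for FritzStack/HiTOP-MedGemma4B-merged.

The heavy model download is kept inside main() so the helper below can be
exercised without fetching the ~8.6 GB of BF16 weights.
"""

MODEL_ID = "FritzStack/HiTOP-MedGemma4B-merged"  # repo id from this card


def build_generation_kwargs(max_new_tokens: int = 128) -> dict:
    # Conservative, deterministic settings for a quick smoke test;
    # these values are assumptions to tune for real use.
    return {"max_new_tokens": max_new_tokens, "do_sample": False}


def main() -> None:
    # Heavy step: downloads the full checkpoint on first run.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    # Hypothetical prompt for illustration only.
    prompt = "Summarize the key symptoms of iron-deficiency anemia."
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, **build_generation_kwargs())
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))


# Uncomment to run generation (triggers the model download):
# main()
```

Keeping generation greedy (`do_sample=False`) makes quick checks reproducible; sampling parameters can be added once the model's behavior has been verified.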
