FritzStack/HiTOP-MedGemma4B-merged

Hugging Face
  • Modality: Vision
  • Concurrency cost: 1
  • Model size: 4.3B
  • Quantization: BF16
  • Context length: 32k
  • Published: Jan 23, 2026
  • License: apache-2.0
  • Architecture: Transformer (open weights)

FritzStack/HiTOP-MedGemma4B-merged is a 4.3 billion parameter Gemma3 model developed by FritzStack, fine-tuned from unsloth/medgemma-4b-pt, a medical-domain Gemma3 variant. It was trained using Unsloth and Hugging Face's TRL library, which the author reports gave a 2x speedup over a standard fine-tuning setup. Because the checkpoint is merged, it can be loaded directly like any other Gemma3 model.


Model Overview

FritzStack/HiTOP-MedGemma4B-merged is a 4.3 billion parameter language model based on the Gemma3 architecture, developed by FritzStack. It was fine-tuned from the unsloth/medgemma-4b-pt model.

Key Capabilities

  • Accelerated Fine-tuning: The model was fine-tuned with Unsloth and Hugging Face's TRL library, with a reported 2x speedup over a standard fine-tuning setup.
  • Gemma3 Architecture: Built upon the Gemma3 foundation, it inherits the core capabilities of this model family.
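
Since the checkpoint is merged, it should load through the standard Transformers interfaces. Below is a minimal sketch, assuming the model exposes the same image-text-to-text interface as its MedGemma base; `build_chat` and `run_inference` are hypothetical helper names, and the heavy model download is kept inside a function that is defined but not called:

```python
MODEL_ID = "FritzStack/HiTOP-MedGemma4B-merged"

def build_chat(question: str) -> list:
    """Single-turn message list in the chat format used by Gemma 3 processors."""
    return [{"role": "user", "content": [{"type": "text", "text": question}]}]

def run_inference(question: str) -> str:
    """Load the merged checkpoint and generate an answer.
    Heavy: downloads the full ~4.3B-parameter weights, so it is
    defined here but not invoked."""
    import torch
    from transformers import AutoModelForImageTextToText, AutoProcessor

    model = AutoModelForImageTextToText.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    processor = AutoProcessor.from_pretrained(MODEL_ID)
    inputs = processor.apply_chat_template(
        build_chat(question),
        add_generation_prompt=True,
        tokenize=True,
        return_dict=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, not the prompt.
    return processor.decode(
        out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )

# Example (run on a machine with enough memory):
# print(run_inference("What are common symptoms of iron deficiency?"))
```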

Good For

  • Efficient Development: Ideal for developers and researchers looking to leverage Gemma3 models with significantly reduced fine-tuning times.
  • Resource-Constrained Environments: At 4.3B parameters the model is small enough to run and further fine-tune on modest hardware, and the Unsloth-based training recipe keeps fine-tuning time and memory requirements low.
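
The Unsloth + TRL workflow described above can be sketched as follows. This is not the author's published recipe: the dataset file, formatting helper `to_text`, and all hyperparameters are illustrative assumptions, and the GPU-bound training code is defined but not called:

```python
def to_text(example: dict) -> dict:
    """Hypothetical formatter: fold a question/answer pair into one training string."""
    return {
        "text": f"### Question:\n{example['question']}\n\n### Answer:\n{example['answer']}"
    }

def finetune(data_files: str = "train.jsonl") -> None:
    """Sketch of a LoRA fine-tune from the same base model.
    Requires a CUDA GPU; defined here but not invoked."""
    from datasets import load_dataset
    from trl import SFTConfig, SFTTrainer
    from unsloth import FastLanguageModel

    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="unsloth/medgemma-4b-pt",
        max_seq_length=2048,
        load_in_4bit=True,  # 4-bit base so it fits on a single consumer GPU
    )
    model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

    dataset = load_dataset("json", data_files=data_files, split="train").map(to_text)
    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=dataset,
        args=SFTConfig(
            per_device_train_batch_size=2,
            max_steps=60,
            output_dir="outputs",
        ),
    )
    trainer.train()
    # Merge the LoRA adapters back into the base weights, which is
    # presumably what the "-merged" suffix of this checkpoint refers to.
    model.save_pretrained_merged("HiTOP-MedGemma4B-merged", tokenizer)
```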

License

The model is released under the Apache-2.0 license.