Ismailea04/medgemma-4b-cardiology-merged

Vision · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: Feb 17, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Ismailea04/medgemma-4b-cardiology-merged is a 4.3 billion parameter Gemma3 model developed by Ismailea04. It was finetuned from unsloth/medgemma-4b-it-unsloth-bnb-4bit using Unsloth and Hugging Face's TRL library for accelerated training. As its name suggests, it is likely specialized for the cardiology domain, and it supports a 32,768-token context length.


Model Overview

Ismailea04/medgemma-4b-cardiology-merged is a 4.3 billion parameter language model, developed by Ismailea04. It is based on the Gemma3 architecture and was finetuned from the unsloth/medgemma-4b-it-unsloth-bnb-4bit model.

Key Characteristics

  • Architecture: Gemma3
  • Parameter Count: 4.3 billion parameters
  • Context Length: 32768 tokens
  • Training Efficiency: The model was trained using Unsloth and Hugging Face's TRL library, enabling a reported 2x faster finetuning process.
  • License: Apache-2.0
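As a hedged sketch of how the merged checkpoint might be used: the repo id below comes from this card, but the chat-template call assumes the standard Gemma conversation format and a recent `transformers` release, and the example question is purely illustrative. Loading in bfloat16 matches the BF16 precision listed above.

```python
MODEL_ID = "Ismailea04/medgemma-4b-cardiology-merged"  # repo id from this card


def build_messages(question: str) -> list[dict]:
    """Build a chat-format message list (single user turn, per Gemma conventions)."""
    return [{"role": "user", "content": question}]


def load_and_generate(question: str, max_new_tokens: int = 256) -> str:
    """Load the merged checkpoint in BF16 and generate a reply.

    Requires `transformers` and `torch`, and downloads several GB of
    weights, so the heavy imports are kept inside the function.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    # Render the conversation with the model's own chat template.
    inputs = tokenizer.apply_chat_template(
        build_messages(question),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(out[0, inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(load_and_generate("What ECG changes are typical in acute myocardial infarction?"))
```

Note this is a usage sketch, not an official quickstart from the model author; adjust dtype and device placement to your hardware.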

Potential Use Cases

Given its name, "medgemma-4b-cardiology-merged," this model is likely specialized for the cardiology domain. Because it was finetuned from a medical base model (MedGemma), it may be useful for:

  • Processing and understanding medical texts related to cardiology.
  • Assisting with medical information retrieval in cardiovascular health.
  • Supporting research or educational tools focused on cardiology.
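When processing long cardiology documents (guidelines, longitudinal patient notes), inputs still have to fit the 32,768-token context listed above. A minimal, hypothetical pre-processing sketch: split a document into chunks under a token budget, using a rough chars-per-token heuristic (an assumption; use the model's actual tokenizer for exact counts).

```python
CONTEXT_TOKENS = 32_768  # context length listed on this card
CHARS_PER_TOKEN = 4      # rough heuristic, not a tokenizer-accurate count


def chunk_text(text: str, budget_tokens: int = CONTEXT_TOKENS // 2) -> list[str]:
    """Split a long document into chunks that roughly fit a token budget.

    Breaks on paragraph boundaries (blank lines) so each chunk stays
    coherent; the default budget leaves half the context for the reply.
    """
    budget_chars = budget_tokens * CHARS_PER_TOKEN
    chunks: list[str] = []
    current: list[str] = []
    size = 0
    for para in text.split("\n\n"):
        # Flush the current chunk if adding this paragraph would exceed the budget.
        if current and size + len(para) > budget_chars:
            chunks.append("\n\n".join(current))
            current, size = [], 0
        current.append(para)
        size += len(para) + 2  # +2 for the paragraph separator
    if current:
        chunks.append("\n\n".join(current))
    return chunks
```

A short document returns unchanged as a single chunk; only inputs over the budget are split. This is generic plumbing around the context limit, not functionality provided by the model itself.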