ASAIs-TDDI-2025/MedTurk-MedGemma-4b
Vision | Concurrency Cost: 1 | Model Size: 4.3B | Quant: BF16 | Ctx Length: 32k | Published: Aug 11, 2025 | License: apache-2.0 | Architecture: Transformer | Open Weights
MedTurk-MedGemma-4b by ASAIs-TDDI-2025 is a 4.3-billion-parameter Gemma3-based causal language model. It was finetuned with Unsloth and Hugging Face's TRL library, which speeds up training, and is intended for general language tasks.
Overview
MedTurk-MedGemma-4b is a 4.3-billion-parameter language model developed by ASAIs-TDDI-2025. It is based on the Gemma3 architecture and was finetuned from unsloth/medgemma-4b-it-unsloth-bnb-4bit. The finetuning used Unsloth together with Hugging Face's TRL library, which the authors report roughly doubled training speed.
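The checkpoint can be loaded with standard Hugging Face tooling. Below is a minimal inference sketch, assuming the finetune keeps the multimodal Gemma3/MedGemma image-text-to-text layout implied by the Vision tag above; the prompt, generation settings, and BF16 dtype choice are illustrative, not taken from the model card.

```python
# Minimal inference sketch for ASAIs-TDDI-2025/MedTurk-MedGemma-4b.
# Assumes the checkpoint keeps the Gemma3/MedGemma image-text-to-text layout;
# the prompt and generation settings below are illustrative.
import torch
from transformers import AutoModelForImageTextToText, AutoProcessor

model_id = "ASAIs-TDDI-2025/MedTurk-MedGemma-4b"

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",
)

# Text-only chat turn; Gemma3-style checkpoints also accept image content parts.
messages = [
    {"role": "user",
     "content": [{"type": "text",
                  "text": "Summarize the typical symptoms of iron-deficiency anemia."}]},
]
inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

with torch.inference_mode():
    generated = model.generate(**inputs, max_new_tokens=256)

# Strip the prompt tokens and decode only the model's reply.
reply = generated[0][inputs["input_ids"].shape[-1]:]
print(processor.decode(reply, skip_special_tokens=True))
```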
Key Capabilities
- Efficient Training: Leverages Unsloth for faster finetuning, making it more resource-efficient for developers (see the sketch after this list).
- Gemma3 Architecture: Built upon the robust Gemma3 foundation, providing strong general language understanding and generation capabilities.
- General Purpose: Suitable for a wide range of natural language processing tasks thanks to its strong base model and finetuning approach.
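For reference, here is a sketch of the Unsloth + TRL finetuning flow the card describes, starting from the base checkpoint named in the Overview. The dataset file, LoRA settings, and training hyperparameters are assumptions for illustration, not the authors' actual recipe.

```python
# Sketch of the Unsloth + TRL (SFTTrainer) finetuning flow the card describes.
# The dataset path, LoRA settings, and hyperparameters are illustrative only.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Base checkpoint named in the Overview (4-bit Unsloth build of MedGemma 4B IT).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/medgemma-4b-it-unsloth-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters; rank/alpha and target modules are typical defaults.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Hypothetical instruction dataset with a single "text" column.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```

Unsloth's hand-written kernels and reduced memory footprint are what enable the roughly 2x training speedup reported in the Overview.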
Good For
- Developers looking for a Gemma3-based model with optimized training efficiency.
- Applications requiring a 4.3 billion parameter model for various language-related tasks.
- Experimentation with models finetuned using Unsloth and TRL for performance benefits.