ASAIs-TDDI-2025/MedTurk-MedGemma-4b
Tags: Vision, Open Weights
Concurrency Cost: 1
Model Size: 4.3B
Quant: BF16
Ctx Length: 32k
Published: Aug 11, 2025
License: apache-2.0
Architecture: Transformer
MedTurk-MedGemma-4b by ASAIs-TDDI-2025 is a 4.3-billion-parameter Gemma 3-based causal language model. It was finetuned using Unsloth together with Hugging Face's TRL library, which enables faster training. The model is intended for general language tasks.
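As a rough back-of-the-envelope check on deployment requirements, the figures in the card (4.3B parameters stored in BF16, i.e. 2 bytes per parameter) imply the weight footprint below. This is a sketch covering weights only; it excludes the KV cache, activations, and framework overhead, so real memory use will be higher.

```python
def weight_memory_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Raw weight footprint in GiB for a model with n_params parameters.

    bytes_per_param defaults to 2, matching the BF16 quantization
    listed in the model card.
    """
    return n_params * bytes_per_param / 1024**3


if __name__ == "__main__":
    # 4.3B parameters in BF16 -> roughly 8 GiB of weights alone.
    print(f"{weight_memory_gib(4.3e9):.1f} GiB")
```

A GPU serving this model at the full 32k context length would additionally need room for the KV cache, which grows linearly with context length and batch size.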