AiHub4MSRH-Hash/hash-MedGemma-27B-16bit-eng-text-it
Text generation · Concurrency cost: 2 · Model size: 27B · Quantization: FP8 · Context length: 32k · Published: Feb 10, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
AiHub4MSRH-Hash/hash-MedGemma-27B-16bit-eng-text-it is a 27-billion-parameter Gemma3_text model fine-tuned by AiHub4MSRH-Hash. It was trained with Unsloth and Hugging Face's TRL library, which the authors report enables up to 2x faster fine-tuning. The model is optimized for English-language text tasks.