google/medgemma-27b-text-it
TEXT GENERATION

Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: May 19, 2025 · License: health-ai-developer-foundations · Architecture: Transformer · 0.4K · Gated · Warm

MedGemma 27B is a 27 billion parameter, text-only, instruction-tuned variant of the Gemma 3 model developed by Google. It is specifically trained on medical text and optimized for inference-time computation, making it suitable for accelerating healthcare-based AI applications. This model excels at medical knowledge and reasoning tasks, outperforming base Gemma models on clinically relevant benchmarks.


Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model draw on the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
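As a rough illustration of how these parameters fit together, the sketch below builds a request payload for an OpenAI-compatible chat-completions endpoint. The specific values shown are placeholders, not the actual Featherless user presets, and the endpoint shape is an assumption based on common OpenAI-compatible APIs.

```python
# Illustrative sampler settings; values are placeholders, NOT the
# actual Featherless community presets for this model.
sampler_config = {
    "temperature": 0.7,        # randomness of token selection
    "top_p": 0.9,              # nucleus sampling cutoff
    "top_k": 40,               # restrict sampling to the top-k tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they appear
    "presence_penalty": 0.0,   # penalize tokens that have appeared at all
    "repetition_penalty": 1.05,  # >1.0 discourages verbatim repetition
    "min_p": 0.05,             # drop tokens below this relative probability
}

# Payload for a hypothetical OpenAI-compatible /v1/chat/completions call.
payload = {
    "model": "google/medgemma-27b-text-it",
    "messages": [
        {"role": "user", "content": "Summarize common causes of anemia."}
    ],
    **sampler_config,
}

print(payload["model"])
```

In practice this payload would be POSTed to the provider's chat-completions endpoint with an API key; only the sampler keys above vary between the saved configurations.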