google/medgemma-27b-text-it

Parameters: 27B
Quantization: FP8
Context length: 32,768 tokens
License: health-ai-developer-foundations
Availability: Public; gated on Hugging Face

Overview

MedGemma 27B: Specialized for Medical Text

MedGemma 27B is a 27-billion-parameter, text-only, instruction-tuned model from Google, built on the Gemma 3 architecture. It is trained on a large corpus of medical text to improve performance in healthcare AI applications, and is optimized for inference-time computation, making it efficient to deploy.

Key Capabilities

  • Medical Knowledge and Reasoning: Demonstrates strong performance across various medical text-only benchmarks, including MedQA, MedMCQA, PubMedQA, MMLU Med, MedXpertQA, and AfriMed-QA.
  • Enhanced Performance: Consistently outperforms the base Gemma 3 27B model on all tested health benchmarks, scoring 89.8% on MedQA (4-op) and 87.0% on MMLU Med (text only).
  • Instruction-Tuned: Available exclusively as an instruction-tuned model, ready for direct application in medical question answering and text generation tasks.
  • Long Context Support: Supports a context window of at least 128K tokens, enabling it to process extensive medical documents in a single pass.

Intended Use

MedGemma 27B is designed as a foundational model for developers in the life sciences and healthcare sector. It serves as an efficient starting point for building and fine-tuning downstream healthcare applications that require robust medical text comprehension and generation. Developers are responsible for further validation and adaptation for specific clinical use cases.
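As a starting point, the model can be queried like any other instruction-tuned model on Hugging Face. The sketch below uses the `transformers` text-generation pipeline with chat-style messages; the system prompt, question, and sampling settings are illustrative assumptions, not part of the model card, and running it requires accepting the gated-model terms plus a GPU with sufficient memory.

```python
def build_messages(question: str) -> list[dict]:
    """Format a medical question as chat messages for the instruction-tuned model.

    The system prompt here is an illustrative assumption; adapt it to your
    application and validate outputs before any clinical use.
    """
    return [
        {"role": "system", "content": "You are a helpful medical assistant."},
        {"role": "user", "content": question},
    ]


if __name__ == "__main__":
    # Deferred import: loading the 27B model is heavyweight, so keep the
    # message-formatting helper importable without pulling in transformers.
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="google/medgemma-27b-text-it",
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # spread layers across available GPUs
    )
    out = pipe(
        build_messages("What are first-line treatments for hypertension?"),
        max_new_tokens=256,
    )
    # The pipeline returns the full chat transcript; the last message is
    # the model's reply.
    print(out[0]["generated_text"][-1]["content"])
```

The same messages format works with an OpenAI-compatible serving endpoint if the model is hosted rather than run locally; only the transport layer changes.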