zero0303/medgemma-1.5-4b-it
VISION · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: Jan 19, 2026 · License: health-ai-developer-foundations · Architecture: Transformer

MedGemma 1.5 4B IT is a 4.3 billion parameter multimodal instruction-tuned model developed by Google, based on the Gemma 3 architecture. It is trained specifically for medical text and image comprehension, and targets tasks such as interpreting high-dimensional medical imaging (CT/MRI), whole-slide histopathology, and electronic health record data. The model is intended to accelerate healthcare AI applications that generate text from medical inputs.
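The listing does not include a usage snippet. As a minimal sketch, assuming the checkpoint is hosted on Hugging Face under the repository name above and follows the standard Gemma 3 multimodal chat format, inference via the `transformers` image-text-to-text pipeline might look like the following (the `describe_image` helper and the image URL are illustrative, not part of the model card):

```python
# Hedged sketch: assumes "zero0303/medgemma-1.5-4b-it" is a Hugging Face
# checkpoint using the standard Gemma 3 multimodal chat format.

# Chat-format request: one user turn pairing an image reference with a prompt.
messages = [
    {
        "role": "user",
        "content": [
            # Placeholder image URL for illustration; substitute a real medical image.
            {"type": "image", "url": "https://example.com/chest_xray.png"},
            {"type": "text", "text": "Describe any abnormal findings in this radiograph."},
        ],
    }
]


def describe_image(chat_messages):
    """Run the multimodal pipeline on a chat-format request.

    Loading the 4.3B BF16 weights needs roughly 10 GB of accelerator
    memory, so nothing heavyweight happens at import time.
    """
    from transformers import pipeline  # local import: heavyweight dependency

    pipe = pipeline("image-text-to-text", model="zero0303/medgemma-1.5-4b-it")
    result = pipe(text=chat_messages, max_new_tokens=256)
    # The pipeline returns the full conversation; the last turn is the reply.
    return result[0]["generated_text"][-1]["content"]
```

The 32k context length leaves ample room for long clinical notes alongside the image tokens.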
