pszemraj/medgemma-27b-text-heretic_med
Text generation
Concurrency cost: 2
Model size: 27B
Quantization: FP8
Context length: 32k
Published: Nov 18, 2025
License: health-ai-developer-foundations
Architecture: Transformer

pszemraj/medgemma-27b-text-heretic_med is a 27-billion-parameter, text-only, instruction-tuned, decoder-only transformer based on Google's MedGemma 27B. This variant has been decensored with the Heretic v1.0.1 tool so that it refuses fewer requests than the original MedGemma model. It retains the 32,768-token context length and is intended primarily for healthcare AI applications that need a medical assistant with less restricted responses.
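Since this is an instruction-tuned Gemma-family model, prompts are expected to follow Gemma's turn-based chat format. The sketch below shows that format built by hand as a plain string; the turn markers (`<start_of_turn>` / `<end_of_turn>`) are the standard Gemma conventions, which this variant is assumed to inherit. In practice you would let the tokenizer's `apply_chat_template` method produce this for you rather than formatting it manually.

```python
def format_gemma_prompt(messages):
    """Render a list of {"role", "content"} dicts as a Gemma-style
    chat prompt, ending with an open model turn for generation.

    Assumes the standard Gemma turn markers; use the model tokenizer's
    apply_chat_template in real code instead of this manual sketch.
    """
    parts = []
    for msg in messages:
        parts.append(
            f"<start_of_turn>{msg['role']}\n{msg['content']}<end_of_turn>\n"
        )
    # Leave the model turn open so generation continues from here.
    parts.append("<start_of_turn>model\n")
    return "".join(parts)


prompt = format_gemma_prompt(
    [{"role": "user", "content": "What are common symptoms of iron-deficiency anemia?"}]
)
print(prompt)
```

The full prompt (all turns plus the open model turn) must fit within the model's 32,768-token context window.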
