pszemraj/medgemma-4b-it-heretic
Modality: Vision
Concurrency cost: 1
Model size: 4.3B
Quantization: BF16
Context length: 32k
Published: Nov 17, 2025
License: health-ai-developer-foundations
Architecture: Transformer

pszemraj/medgemma-4b-it-heretic is a 4.3-billion-parameter, instruction-tuned multimodal language model derived from Google's MedGemma-4B-IT, with a 32,768-token context length. This variant has been decensored with the Heretic v1.0.1 tool, which significantly reduces refusals relative to the original model. It is optimized for medical text and image comprehension, performing well at medical image classification, visual question answering, and text-based medical knowledge tasks, making it suitable for healthcare AI applications that require less restrictive responses.
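The medical visual question answering use case described above can be sketched with a short inference example. This is a minimal sketch assuming the Hugging Face `transformers` multimodal chat API used for Gemma 3-family models (`AutoProcessor`, `AutoModelForImageTextToText`); the class names, the `build_messages` helper, and the generation settings are assumptions, not taken from this page.

```python
# Hypothetical usage sketch for pszemraj/medgemma-4b-it-heretic,
# assuming the Gemma 3-style multimodal chat API in Hugging Face
# transformers. Names and defaults here are assumptions.
MODEL_ID = "pszemraj/medgemma-4b-it-heretic"


def build_messages(question: str, image_url: str) -> list[dict]:
    """Build one chat-format request pairing an image with a question."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image", "url": image_url},
                {"type": "text", "text": question},
            ],
        }
    ]


def answer(question: str, image_url: str, max_new_tokens: int = 256) -> str:
    """Run a single medical VQA query against the model."""
    # Heavy imports are local so build_messages() stays usable
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForImageTextToText, AutoProcessor

    processor = AutoProcessor.from_pretrained(MODEL_ID)
    model = AutoModelForImageTextToText.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
        device_map="auto",
    )
    inputs = processor.apply_chat_template(
        build_messages(question, image_url),
        add_generation_prompt=True,
        tokenize=True,
        return_dict=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[-1]:]
    return processor.decode(new_tokens, skip_special_tokens=True)
```

Keeping prompt construction separate from model loading makes the chat-message format easy to inspect and reuse with other runtimes that accept the same schema.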
