Aakash010/MedGemma_FineTuned
Vision-capable · Open weights
Concurrency cost: 1 · Model size: 4.3B · Quantization: BF16 · Context length: 32k
Published: Dec 21, 2025 · License: apache-2.0 · Architecture: Transformer

MedGemma_FineTuned by Aakash010 is a 4.3-billion-parameter language model fine-tuned from a Gemma-family base. As the name suggests, the fine-tuning targets specialized (medical-domain) applications. With a context length of 32,768 tokens, it can process long inputs, such as extended documents or multi-turn conversations, in a single pass.
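A minimal usage sketch with Hugging Face `transformers` is shown below. The repo id and the 32k context length come from this card; the prompt, generation settings, and the `fits_in_context` helper (with its reserved output budget) are illustrative assumptions, not the author's recommended configuration.

```python
# Sketch: loading Aakash010/MedGemma_FineTuned and checking that a prompt
# plus the requested generation budget fits inside the 32k context window.

MAX_CONTEXT_TOKENS = 32_768  # context length stated on the model card


def fits_in_context(prompt_tokens: int, reserved_for_output: int = 512) -> bool:
    """Return True if `prompt_tokens` plus the output budget fits in 32k."""
    return prompt_tokens + reserved_for_output <= MAX_CONTEXT_TOKENS


if __name__ == "__main__":
    # Heavy imports and the checkpoint download stay behind the main guard.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "Aakash010/MedGemma_FineTuned"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype="bfloat16")

    prompt = "Summarize the typical presentation of iron-deficiency anemia."
    inputs = tokenizer(prompt, return_tensors="pt")
    assert fits_in_context(inputs["input_ids"].shape[-1])

    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Guarding the download behind `__main__` keeps the helper importable without pulling the 4.3B checkpoint; the actual generation step requires enough GPU or CPU memory for a BF16 model of that size.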
