yashroff/gemma-3-1b-medical-finetuned
Text generation · Concurrency cost: 1 · Model size: 1B · Quant: BF16 · Context length: 32k · Published: Apr 9, 2026 · Architecture: Transformer · Warm

The yashroff/gemma-3-1b-medical-finetuned model is a 1-billion-parameter language model fine-tuned from the Gemma architecture for medical applications, adapting the base model to specialized tasks in the healthcare domain. With a context length of 32768 tokens, it can process and generate long medical texts. Its primary strength is this focused medical fine-tuning, which distinguishes it from general-purpose LLMs.
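A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under this repo id and follows the standard Gemma turn-based chat format (both are assumptions, not confirmed by this card; the example question is purely illustrative):

```python
MODEL_ID = "yashroff/gemma-3-1b-medical-finetuned"  # assumed Hub repo id

def build_prompt(question: str) -> str:
    """Wrap a question in Gemma's turn-based chat template (assumed format)."""
    return (
        "<start_of_turn>user\n"
        f"{question}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

if __name__ == "__main__":
    # Heavy imports kept inside the entry point; loading downloads ~2 GB in BF16.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    inputs = tokenizer(
        build_prompt("What are common symptoms of iron-deficiency anemia?"),
        return_tensors="pt",
    )
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because the model fits in roughly 2 GB at BF16, it can run on a single consumer GPU or, more slowly, on CPU.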