aparnavirtuonai/mistral-medqa
TEXT GENERATION
Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Concurrency Cost: 1
Published: Mar 18, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights
aparnavirtuonai/mistral-medqa is a 7-billion-parameter language model fine-tuned from mistralai/Mistral-7B-v0.1 and optimized for medical question answering. The model was trained with supervised fine-tuning (SFT) using LoRA adapters on the MedQA dataset to improve its ability to interpret and answer medical questions. It targets applications that require domain-specific reasoning in healthcare, education, and research, and it retains stable general-purpose performance alongside its baseline medical-domain adaptation.
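To illustrate what SFT on MedQA typically involves, the sketch below formats a MedQA-style multiple-choice item into a single prompt/completion pair of the kind used as a supervised training example. This is a minimal, hypothetical sketch: the field names, prompt template, and helper function are illustrative assumptions, not the model card's actual training recipe.

```python
def format_medqa_example(question, options, answer_key):
    """Turn a MedQA-style multiple-choice item into a prompt/response
    pair for supervised fine-tuning (SFT).

    `options` maps letter keys (e.g. "A") to answer texts; the exact
    template here is an illustrative assumption, not the one used to
    train aparnavirtuonai/mistral-medqa.
    """
    # Render the answer choices as "A. ...", "B. ...", sorted by letter.
    choices = "\n".join(f"{k}. {v}" for k, v in sorted(options.items()))
    prompt = f"Question: {question}\nOptions:\n{choices}\nAnswer:"
    # The target completion the model is trained to produce.
    response = f" {answer_key}. {options[answer_key]}"
    return {"prompt": prompt, "response": response}


example = format_medqa_example(
    "Which vitamin deficiency causes scurvy?",
    {"A": "Vitamin A", "B": "Vitamin B12", "C": "Vitamin C", "D": "Vitamin D"},
    "C",
)
print(example["response"])  # → " C. Vitamin C"
```

During LoRA-based SFT, pairs like this are tokenized and the loss is usually computed only on the response tokens, so the base model's weights stay frozen while small low-rank adapter matrices learn the medical-QA behavior.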