dizza01/BioMistral-7B-DARE
Text Generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 29, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

dizza01/BioMistral-7B-DARE is a 7-billion-parameter language model based on Mistral-7B-Instruct-v0.1, produced with the DARE TIES merge method. Its biomedical component was further pre-trained on PubMed Central data, specializing the model for biomedical and medical domains. It targets medical question-answering tasks, where it reports stronger results than other open-source medical models.
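Because the model is based on Mistral-7B-Instruct-v0.1, prompts are expected to follow the Mistral instruct chat format. A minimal sketch of building such a prompt (the question text is illustrative, and `build_mistral_prompt` is a hypothetical helper; in practice the tokenizer's chat template handles this):

```python
def build_mistral_prompt(user_message: str) -> str:
    """Wrap a single user turn in Mistral-7B-Instruct-v0.1's chat format.

    The instruct format encloses the user message in [INST] ... [/INST];
    the <s> BOS token is shown here explicitly, though a tokenizer
    normally prepends it automatically.
    """
    return f"<s>[INST] {user_message.strip()} [/INST]"

prompt = build_mistral_prompt(
    "What are the first-line treatments for type 2 diabetes?"
)
print(prompt)
```

The formatted string is then passed to the model as-is; the model's answer is generated after the closing `[/INST]` tag.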
