BioMistral/BioMistral-7B-DARE
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Feb 5, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open

BioMistral/BioMistral-7B-DARE is a 7-billion-parameter language model developed by BioMistral, based on the Mistral-7B-Instruct-v0.1 architecture and further pre-trained on PubMed Central. The model is a merge created with the DARE TIES method and is optimized for biomedical and medical question-answering tasks. It outperforms other open-source medical models and achieves competitive results against proprietary counterparts on medical benchmarks.
