BioMistral/BioMistral-7B-SLERP
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 3, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

BioMistral/BioMistral-7B-SLERP is a 7 billion parameter language model from BioMistral, created by merging BioMistral/BioMistral-7B and mistralai/Mistral-7B-Instruct-v0.1 with the SLERP method. Tailored to the biomedical domain, it inherits BioMistral-7B's further pre-training on PubMed Central data and outperforms other open-source medical models on medical question-answering benchmarks, making it well suited to biomedical research applications.
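SLERP (spherical linear interpolation) blends the two parent models along the arc between their weight vectors rather than the straight line, which preserves the norm of the interpolated weights. A minimal NumPy sketch of the core interpolation formula (illustrative only — real merges apply this per-tensor via tooling such as mergekit, not to whole flattened models):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at fraction t."""
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    if abs(dot) > 1.0 - eps:
        # Vectors nearly parallel: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)          # angle between the two vectors
    so = np.sin(omega)
    # Weighted combination along the great-circle arc
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example: midpoint between two orthogonal unit vectors stays on the unit sphere
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # → [0.7071..., 0.7071...], norm 1
```

Compared with naive weight averaging, the spherical path avoids shrinking weight magnitudes when the parents' parameters point in different directions.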
