arcee-ai/Biomistral-Calme-Instruct-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Biomistral-Calme-Instruct-7b is a 7 billion parameter instruction-tuned language model created by arcee-ai, formed by merging MaziyarPanahi/Calme-7B-Instruct-v0.1.1 and BioMistral/BioMistral-7B. This model combines general instruction-following capabilities with specialized knowledge from the biomedical domain. It is designed for tasks requiring both broad conversational understanding and specific scientific or medical context.
Biomistral-Calme-Instruct-7b Overview
Biomistral-Calme-Instruct-7b is a 7 billion parameter language model developed by arcee-ai. It is a merged model, combining the strengths of two distinct base models: MaziyarPanahi/Calme-7B-Instruct-v0.1.1 and BioMistral/BioMistral-7B.
Key Characteristics
- Hybrid Capabilities: This model integrates general instruction-following abilities from the Calme-7B-Instruct model with the specialized biomedical knowledge present in BioMistral-7B.
- Merge Method: The model was created with `mergekit` using a slerp (spherical linear interpolation) merge method, blending the weights of the constituent models across different layers.
- Parameter Configuration: Specific `t` interpolation values were applied to the self-attention and MLP layers during the merge to balance the contributions of the two base models.
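As a rough illustration of what the slerp step does (a minimal sketch, not `mergekit`'s actual implementation), spherically interpolating two weight tensors with a factor `t` can be written as:

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns a, t=1 returns b; intermediate values move along the arc
    between the two (flattened, normalized) weight vectors.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    a_dir = a_flat / (np.linalg.norm(a_flat) + eps)
    b_dir = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(np.dot(a_dir, b_dir), -1.0, 1.0)
    theta = np.arccos(dot)
    # Near-parallel tensors: fall back to plain linear interpolation.
    if theta < eps:
        return (1.0 - t) * a + t * b
    sin_theta = np.sin(theta)
    out = (np.sin((1.0 - t) * theta) / sin_theta) * a_flat \
        + (np.sin(t * theta) / sin_theta) * b_flat
    return out.reshape(a.shape)

# Toy example: two small "weight matrices" standing in for real layers.
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, a, b)
```

In a real merge, a per-layer `t` schedule (e.g. different values for attention and MLP weights, as described above) determines how much each layer leans toward Calme-7B-Instruct versus BioMistral-7B.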
Potential Use Cases
- Biomedical Q&A: Answering questions that require both general understanding and specific knowledge in biology, medicine, or related scientific fields.
- Instruction Following: Executing complex instructions in contexts that might involve scientific terminology or concepts.
- Research Assistance: Aiding in tasks like summarizing scientific papers, generating hypotheses, or extracting information from biomedical texts, while maintaining conversational coherence.
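If the merged model follows the Mistral-style `[INST] ... [/INST]` chat template of its base models (an assumption worth verifying against the tokenizer config on the model card), a biomedical Q&A prompt might be formatted like this:

```python
def format_instruct_prompt(user_message: str, system: str = "") -> str:
    """Wrap a user message in the [INST] ... [/INST] format used by
    Mistral-style instruct models (assumed, not confirmed, for this merge)."""
    content = f"{system}\n\n{user_message}" if system else user_message
    return f"<s>[INST] {content} [/INST]"

prompt = format_instruct_prompt(
    "Summarize the mechanism of action of metformin in two sentences.",
    system="You are a careful biomedical assistant.",
)
```

The formatted string can then be passed to any inference backend serving the model; with a 4k context window, long source documents for summarization may need to be chunked first.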