arcee-ai/BioMistral-merged-instruct
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Feb 27, 2024 · Architecture: Transformer
arcee-ai/BioMistral-merged-instruct is a 7-billion-parameter instruction-tuned language model from arcee-ai, built on the Mistral-7B-v0.1 architecture. It is a merge of Mistral-7B-Instruct-v0.2 and BioMistral-7B, designed to combine general instruction following with specialized biomedical knowledge. It is suited to tasks that require both broad conversational ability and domain-specific understanding of biology and medicine.
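The card does not state how the two parent models were combined. As a sketch only, a merge like this is commonly specified with a mergekit-style config; the method (`slerp`), layer ranges, and interpolation weight below are illustrative assumptions, not the actual recipe used for this model:

```yaml
# Hypothetical mergekit config — the real merge method and weights
# for BioMistral-merged-instruct are not documented on this card.
slices:
  - sources:
      - model: mistralai/Mistral-7B-Instruct-v0.2
        layer_range: [0, 32]      # Mistral-7B has 32 transformer layers
      - model: BioMistral/BioMistral-7B
        layer_range: [0, 32]
merge_method: slerp               # spherical interpolation of weights (assumed)
base_model: mistralai/Mistral-7B-Instruct-v0.2
parameters:
  t: 0.5                          # assumed equal blend of the two parents
dtype: bfloat16
```

A config of this shape would be run with mergekit's CLI to produce a single checkpoint blending the instruction-following parent with the biomedical one.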