BioMistral/BioMistral-7B-OpenHermes-SLERP
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 3, 2024 · Architecture: Transformer
BioMistral/BioMistral-7B-OpenHermes-SLERP is a 7-billion-parameter language model created by merging teknium/OpenHermes-2-Mistral-7B and Project44/BioMistral-7B-0.1-PubMed-V2 with the SLERP (spherical linear interpolation) method. The merge combines the general conversational ability of OpenHermes with the specialized biomedical knowledge of BioMistral, making the model suitable for tasks that call for both broad understanding and domain expertise. With a 4096-token context window, it serves both biomedical applications and general-purpose assistant use, and the merge strategy is intended to keep performance steady across diverse linguistic and scientific contexts.
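SLERP merging interpolates between corresponding weight tensors of the two parent models along an arc on the unit hypersphere rather than along a straight line, which keeps the magnitude of the merged weights closer to the originals. The sketch below illustrates the per-tensor operation under simplified assumptions; the `slerp` and `merge_state_dicts` helpers are illustrative, not the exact implementation used to produce this model (merges of this kind are typically done with dedicated tooling).

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    # Measure the angle between the two tensors, treated as flat vectors.
    v0_f, v1_f = v0.flatten().float(), v1.flatten().float()
    cos_theta = torch.dot(v0_f, v1_f) / (v0_f.norm() * v1_f.norm() + eps)
    theta = torch.acos(cos_theta.clamp(-1.0, 1.0))
    sin_theta = torch.sin(theta)
    # Nearly parallel tensors: fall back to plain linear interpolation.
    if sin_theta.abs() < eps:
        return (1.0 - t) * v0 + t * v1
    # Weight each endpoint so the interpolation follows the arc between them.
    w0 = torch.sin((1.0 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return w0 * v0 + w1 * v1

def merge_state_dicts(sd_a: dict, sd_b: dict, t: float = 0.5) -> dict:
    """Merge two model state dicts key by key (t=0.5 gives an even blend)."""
    return {k: slerp(t, sd_a[k], sd_b[k]) for k in sd_a}
```

Compared with plain weight averaging, the spherical path avoids shrinking weight norms when the two parents point in different directions in parameter space.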
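For running the merged model, the following is a minimal inference sketch. It assumes the checkpoint is available on the Hugging Face Hub under the ID above and loads with the standard transformers API; the prompt is only an example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BioMistral/BioMistral-7B-OpenHermes-SLERP"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halve memory for the 7B weights
    device_map="auto",
)

prompt = "Summarize the mechanism of action of metformin."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# Keep prompt plus generation within the 4096-token context window.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```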