Technoculture/BioMistral-Carpybara-Slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 21, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
Technoculture/BioMistral-Carpybara-Slerp is a 7 billion parameter language model created by Technoculture by merging BioMistral/BioMistral-7B-DARE and argilla/CapybaraHermes-2.5-Mistral-7B with spherical linear interpolation (slerp). The merge combines the biomedical specialization of BioMistral with the general conversational strengths of CapybaraHermes, and supports a 4096-token context length. It is intended for applications that need a blend of specialized biomedical knowledge and broad conversational capability.
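A minimal sketch of running the model for text generation with Hugging Face transformers, assuming the weights are published on the Hub under the repo id shown above; the prompt, dtype, and sampling settings below are illustrative choices, not values prescribed by this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Technoculture/BioMistral-Carpybara-Slerp"

# Load tokenizer and weights; float16 keeps the 7B model within a single-GPU memory budget.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Example biomedical-flavored prompt (hypothetical; the card does not fix a prompt format).
prompt = "Briefly explain the difference between systolic and diastolic blood pressure."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output_ids = model.generate(
    **inputs,
    max_new_tokens=256,  # stays well within the 4096-token context window
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```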