arcee-ai/Biomistral-Exp-Slerp
Task: Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 11, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Biomistral-Exp-Slerp is a 7-billion-parameter language model from arcee-ai, created by merging BioMistral/BioMistral-7B and yam-peleg/Experiment26-7B with the SLERP (spherical linear interpolation) merge method. The merge aims to combine the strengths of its base components, and may offer improved performance on tasks covered by the merged models, particularly the biomedical domain targeted by BioMistral. It is intended for general language tasks and supports a 4096-token context length.
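To illustrate the SLERP technique named above, here is a minimal, self-contained sketch of spherical linear interpolation between two weight vectors. It is not arcee-ai's actual merge pipeline (real merges operate tensor-by-tensor across model checkpoints, typically via tooling such as mergekit, with the interpolation factor `t` as a tunable hyperparameter); the function name and the plain-list representation are illustrative choices.

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Interpolates along the great-circle arc between v0 and v1,
    which preserves the geometry of the weight space better than
    a plain linear average when the vectors are not nearly parallel.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the normalized vectors.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))  # guard against rounding error
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

At `t = 0` the result is the first vector, at `t = 1` the second, and intermediate values trace the arc between them rather than the straight chord.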
