arcee-ai/saul-mistral-v0.1-7b-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
arcee-ai/saul-mistral-v0.1-7b-slerp is a 7-billion-parameter language model merged from Equall/Saul-Base and mistralai/Mistral-7B-Instruct-v0.1 using SLERP (spherical linear interpolation). By interpolating the two parent checkpoints' weights, the merge aims to combine Saul-Base's legal-domain training with Mistral-7B-Instruct's instruction-following ability. The model supports a context length of 4096 tokens, making it suitable for general-purpose text generation and instruction-following applications.
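To make the merge method concrete, here is a minimal sketch of SLERP applied to a pair of weight tensors. This is an illustration of the general technique, not the exact merge recipe used for this model (in practice, tools such as mergekit apply per-layer interpolation factors):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    great-circle arc between the two (normalized) directions.
    """
    v0_norm = v0 / (np.linalg.norm(v0) + eps)
    v1_norm = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_norm.ravel(), v1_norm.ravel()), -1.0, 1.0)
    # Nearly parallel tensors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1
```

A merge would apply a function like this to each matching parameter tensor of the two parent models, then save the result as a new checkpoint.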