arcee-ai/Mistral-Instruct-Orca-Slerp
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Feb 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Mistral-Instruct-Orca-Slerp by arcee-ai is a 7-billion-parameter language model with a 4,096-token context length, created by merging Mistral-7B-Instruct-v0.2 and Mistral-7B-OpenOrca using spherical linear interpolation (slerp). The merge combines the instruction-following capabilities of Mistral-Instruct with the Orca-style fine-tuning of OpenOrca for improved reasoning and conversational performance, making the model well suited to general-purpose instruction-following tasks.
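To illustrate the idea behind a slerp merge, here is a minimal sketch: rather than averaging two models' weights linearly, slerp interpolates along the great-circle arc between the flattened parameter tensors, which better preserves their magnitudes. This is a toy illustration only, not arcee-ai's actual merge pipeline; the function name, the `t=0.5` mixing ratio, and the tiny example matrices are all assumptions for demonstration.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Toy sketch of a slerp-style merge: interpolate along the arc
    between the flattened parameter vectors, falling back to linear
    interpolation when they are nearly colinear.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Cosine of the angle between the two parameter vectors.
    cos_omega = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)
    if np.sin(omega) < eps:
        # Nearly parallel vectors: plain lerp is numerically safer.
        merged = (1.0 - t) * a + t * b
    else:
        merged = (np.sin((1.0 - t) * omega) * a
                  + np.sin(t * omega) * b) / np.sin(omega)
    return merged.reshape(v0.shape)

# Hypothetical example: "merge" two small weight matrices at the midpoint.
w_instruct = np.array([[1.0, 0.0], [0.0, 1.0]])
w_openorca = np.array([[0.0, 1.0], [1.0, 0.0]])
w_merged = slerp(0.5, w_instruct, w_openorca)
```

In a real merge, a tool such as mergekit applies this interpolation tensor-by-tensor across both checkpoints, often with a per-layer schedule for `t` rather than a single global value.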
