Open-Orca/Mistral-7B-SlimOrca
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 8, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

Open-Orca/Mistral-7B-SlimOrca is a 7 billion parameter language model developed by Open-Orca, fine-tuned from the Mistral 7B base model. It is trained on SlimOrca, a curated subset of the OpenOrca dataset containing approximately 500k GPT-4 completions. The model achieves near-parity with larger models on the HuggingFace Leaderboard, with an average score of 65.85, making it an efficient choice for general-purpose instruction-following tasks.
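As a usage sketch: Open-Orca's Mistral fine-tunes are documented as using the ChatML prompt format, so prompts are assembled from `<|im_start|>`/`<|im_end|>` delimited turns before being passed to the model (e.g. via `transformers.AutoModelForCausalLM.from_pretrained("Open-Orca/Mistral-7B-SlimOrca")`). The helper below is an illustrative assumption, not an official API:

```python
# Minimal sketch of ChatML prompt assembly for Mistral-7B-SlimOrca.
# build_chatml_prompt is a hypothetical helper name; the turn markers
# follow the ChatML convention the model card describes.

def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML prompt with a system turn, a user turn,
    and an open assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize the SlimOrca dataset in one sentence.",
)
```

The resulting string would then be tokenized and passed to the model's `generate` method; within the 4k context window, system, user, and generated tokens all count toward the limit.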
