Weyaxi/SlimOpenOrca-Mistral-7B-v2

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 11, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer

Weyaxi/SlimOpenOrca-Mistral-7B-v2 is a 7 billion parameter language model based on the Mistral architecture, created by merging Open-Orca's Mistral-7B-SlimOrca and Mistral-7B-OpenOrca models. This merged model is designed for general-purpose conversational AI and instruction following, leveraging the strengths of its constituent models. It achieves an average score of 52.96 on the Open LLM Leaderboard, demonstrating capabilities across various benchmarks including ARC, HellaSwag, and MMLU.


Model Overview

Weyaxi/SlimOpenOrca-Mistral-7B-v2 is a 7 billion parameter language model built upon the Mistral architecture. It was created by merging two prominent Open-Orca models: Mistral-7B-SlimOrca and Mistral-7B-OpenOrca. The merge used the TIES method, weighting SlimOrca at 0.6 and OpenOrca at 0.4, with equal density for both models.
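To make the merge recipe concrete, here is a simplified, illustrative sketch of TIES-style merging on plain Python lists rather than real model tensors. The 0.6/0.4 weights and equal density come from the card; everything else (the base values, the exact trimming and sign-election details) is a toy approximation of the method, not the actual merge pipeline.

```python
def trim(delta, density):
    """Zero out all but the top `density` fraction of entries by magnitude."""
    k = max(1, int(len(delta) * density))
    threshold = sorted((abs(d) for d in delta), reverse=True)[k - 1]
    return [d if abs(d) >= threshold else 0.0 for d in delta]

def ties_merge(base, models, weights, density):
    """Toy TIES merge: trim task deltas, elect a sign per parameter,
    then average only the deltas that agree with the elected sign."""
    deltas = [[m_i - b for m_i, b in zip(m, base)] for m in models]
    trimmed = [trim(d, density) for d in deltas]
    merged = []
    for i, b in enumerate(base):
        # Weighted sign election across models.
        s = sum(w * t[i] for w, t in zip(weights, trimmed))
        sign = 1.0 if s >= 0 else -1.0
        # Keep only deltas whose sign matches the elected one.
        agree = [(w, t[i]) for w, t in zip(weights, trimmed) if t[i] * sign > 0]
        if agree:
            num = sum(w * d for w, d in agree)
            den = sum(w for w, _ in agree)
            merged.append(b + num / den)
        else:
            merged.append(b)
    return merged

# Two hypothetical 4-parameter "models" merged at 0.6/0.4 with density 0.5.
result = ties_merge(
    base=[0.0, 0.0, 0.0, 0.0],
    models=[[1.0, 2.0, -3.0, 0.1], [1.0, -2.0, 3.0, 0.2]],
    weights=[0.6, 0.4],
    density=0.5,
)
```

Note how conflicting parameters (index 2, where the two models pull in opposite directions) resolve to the higher-weighted model's value instead of cancelling out, which is the point of sign election over naive averaging.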

Key Capabilities

  • General Instruction Following: Designed to respond effectively to a wide range of user instructions and prompts.
  • Reasoning and Common Sense: Exhibits capabilities in tasks requiring reasoning, as indicated by its performance on benchmarks like ARC and HellaSwag.
  • Knowledge-based Question Answering: Demonstrates proficiency in general knowledge tasks, reflected in its MMLU score.
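The card does not state the expected prompt format, but both parent OpenOrca fine-tunes are commonly used with ChatML-style prompts, so a merged model would plausibly expect the same. The helper below is an assumption-labeled sketch of that format, not documented behavior of this model:

```python
def build_chatml_prompt(system, turns):
    """Assemble a ChatML-style prompt (assumed format for this merge,
    inherited from its OpenOrca parents; verify against the model card)."""
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, text in turns:
        parts.append(f"<|im_start|>{role}\n{text}<|im_end|>")
    # Open an assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    [("user", "Summarize the TIES merge method in one sentence.")],
)
```

The resulting string can be passed to any text-generation API as a plain prompt.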

Performance Benchmarks

Evaluated on the Open LLM Leaderboard, the model achieved an average score of 52.96. Specific benchmark results include:

  • ARC (25-shot): 62.88
  • HellaSwag (10-shot): 83.41
  • MMLU (5-shot): 62.05
  • TruthfulQA (0-shot): 56.65
  • Winogrande (5-shot): 77.58
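Note that the five scores listed above average to roughly 68.5, not 52.96; the published leaderboard average presumably also folds in lower-scoring tasks not itemized in this card (an inference, not something the card states). A quick check of the listed subset:

```python
# Benchmark scores as listed in the card.
scores = {
    "ARC (25-shot)": 62.88,
    "HellaSwag (10-shot)": 83.41,
    "MMLU (5-shot)": 62.05,
    "TruthfulQA (0-shot)": 56.65,
    "Winogrande (5-shot)": 77.58,
}

# Simple mean of the listed tasks only; this is not the
# leaderboard's published average of 52.96.
listed_mean = sum(scores.values()) / len(scores)
```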

Good For

  • Conversational AI applications requiring a balanced instruction-tuned model.
  • Tasks benefiting from strong general reasoning and common sense.
  • Developers seeking a 7B parameter model with a solid foundation from established Open-Orca fine-tunes.