Weyaxi/SlimOpenOrca-Mistral-7B-v2
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights

Weyaxi/SlimOpenOrca-Mistral-7B-v2 is a 7 billion parameter language model based on the Mistral architecture, created by merging Open-Orca's Mistral-7B-SlimOrca and Mistral-7B-OpenOrca models. The merge is intended for general-purpose conversational AI and instruction following, combining the strengths of its two parent models. It scores an average of 52.96 on the Open LLM Leaderboard, a figure averaged over benchmarks including ARC, HellaSwag, and MMLU.
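The parent OpenOrca Mistral models use the ChatML prompt format; assuming the merged model inherits that template (an assumption, not stated on this card), a prompt for it might be assembled like this minimal sketch:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    # ChatML-style prompt as used by the Mistral-7B-OpenOrca parent models.
    # Whether SlimOpenOrca-Mistral-7B-v2 inherits this template is an
    # assumption; check the model card's tokenizer config before relying on it.
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize the Orca training approach in one sentence.",
)
print(prompt)
```

The resulting string would be passed to the model's tokenizer as-is; the trailing `<|im_start|>assistant\n` cues the model to begin its reply.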
