MaziyarPanahi/TheTop-5x7B-Instruct-S2-v0.1
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

MaziyarPanahi/TheTop-5x7B-Instruct-S2-v0.1 is a 7-billion-parameter instruction-tuned language model created by MaziyarPanahi. It is a merge of top-ranked 7B models using the SLERP method, designed to combine the strengths of multiple base models in a single set of weights. It achieves an average score of 72.57 on the Open LLM Leaderboard, demonstrating strong general-purpose performance across benchmarks spanning reasoning and commonsense tasks. The model is suitable for a wide range of generative AI applications requiring a balanced performance profile.
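The SLERP (spherical linear interpolation) merge mentioned above blends two models' parameters along the great-circle arc between them rather than along a straight line, which better preserves the magnitude of the weights. The following is a minimal sketch of the interpolation formula applied to a single flattened weight vector; it is illustrative only and is not the actual script used to produce this merge (real merges apply this per tensor, often with a per-layer interpolation factor `t`).

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t  -- interpolation factor in [0, 1] (0 returns v0, 1 returns v1)
    v0 -- list of floats, weights from the first model
    v1 -- list of floats, weights from the second model
    """
    # Angle between the two vectors via the normalized dot product.
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    theta = math.acos(dot)

    # Nearly parallel vectors: fall back to plain linear interpolation.
    if theta < eps:
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]

    s = math.sin(theta)
    return [
        math.sin((1 - t) * theta) / s * a + math.sin(t * theta) / s * b
        for a, b in zip(v0, v1)
    ]

# Midpoint between two orthogonal unit vectors stays on the unit sphere.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

Unlike a straight average (which would give `[0.5, 0.5]`, a shorter vector), the SLERP midpoint here is `[√2/2, √2/2]`, keeping unit norm.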
