MaziyarPanahi/TheTop-5x7B-Instruct-D-v0.1
TEXT GENERATION
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Concurrency Cost: 1
- Published: Feb 12, 2024
- License: apache-2.0
- Architecture: Transformer
- Open Weights

MaziyarPanahi/TheTop-5x7B-Instruct-D-v0.1 is a 7-billion-parameter instruction-tuned language model created by MaziyarPanahi, formed by merging top-performing 7B models with the DARE method. The model performs strongly across standard benchmarks, with an average score of 74.54 on the Open LLM Leaderboard and notable results on reasoning and commonsense tasks. It is designed for general-purpose instruction following and excels on HellaSwag (88.21) and Winogrande (84.37).
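The DARE merge mentioned above works by taking each fine-tuned model's delta from a shared base, randomly dropping a large fraction of those delta parameters, rescaling the survivors to preserve the expected update, and adding the sparsified deltas back onto the base. The sketch below illustrates that idea on toy NumPy arrays; the function name, drop rate, and averaging of deltas are illustrative assumptions, not the exact recipe used to build this model.

```python
import numpy as np

def dare_merge(base, finetuned_list, drop_rate=0.9, seed=0):
    """Toy illustration of DARE-style merging (not the exact recipe
    used for this model). For each fine-tuned checkpoint:
      1. compute its delta from the shared base weights,
      2. randomly drop a fraction `drop_rate` of the delta entries,
      3. rescale survivors by 1/(1 - drop_rate) so the expected
         update is unchanged,
    then average the sparsified deltas back onto the base."""
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base)
    for ft in finetuned_list:
        delta = ft - base
        keep_mask = rng.random(delta.shape) >= drop_rate  # keep ~(1-p) entries
        merged_delta += (delta * keep_mask) / (1.0 - drop_rate)
    return base + merged_delta / len(finetuned_list)
```

With `drop_rate=0.0` no entries are dropped, so the merge reduces to a plain average of the fine-tuned weights; raising the drop rate sparsifies each delta while keeping its expected contribution the same.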
