MaziyarPanahi/TheTop-5x7B-Instruct-S4-v0.1
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
MaziyarPanahi/TheTop-5x7B-Instruct-S4-v0.1 is a 7-billion-parameter instruction-tuned language model published by MaziyarPanahi. It was produced with the mergekit toolkit by merging several top-performing 7B models using SLERP (spherical linear interpolation). The model performs strongly across standard benchmarks, with an average score of 74.94 on the Open LLM Leaderboard, making it suitable for general-purpose conversational AI and reasoning tasks.
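For context, a mergekit SLERP merge is driven by a small YAML recipe. The sketch below shows the general shape of such a configuration; the source model names, layer ranges, and interpolation weights here are purely illustrative and are not the actual recipe used to build this model.

```yaml
# Hypothetical mergekit SLERP config (illustrative only, not the real recipe)
slices:
  - sources:
      - model: example-org/model-A-7B   # placeholder source model
        layer_range: [0, 32]
      - model: example-org/model-B-7B   # placeholder source model
        layer_range: [0, 32]
merge_method: slerp
base_model: example-org/model-A-7B
parameters:
  t:
    # Interpolation factor t varies by layer group; 0 keeps model A, 1 keeps model B
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5   # default for all remaining tensors
dtype: bfloat16
```

SLERP interpolates between two models' weight tensors along the arc of a hypersphere rather than linearly, which tends to preserve each parent model's learned structure better than a plain weighted average.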