MaziyarPanahi/TheTop-5x7B-Instruct-T-v0.1
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Feb 12, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights

TheTop-5x7B-Instruct-T-v0.1 by MaziyarPanahi is a 7-billion-parameter instruction-tuned language model created by merging top-ranked 7B models with the TIES merge method. It achieves an average score of 74.96 on the Open LLM Leaderboard, demonstrating strong general reasoning, and is particularly effective for tasks requiring broad knowledge and logical inference. Its context length is 4096 tokens.
