Gille/StrangeMerges_22-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Gille/StrangeMerges_22-7B-slerp is a 7-billion-parameter language model created by Gille, produced by merging Gille/StrangeMerges_21-7B-slerp and paulml/OGNO-7B with the slerp (spherical linear interpolation) method. The model shows strong general reasoning ability, scoring an average of 76.16 on the Open LLM Leaderboard, including 73.72 on the AI2 Reasoning Challenge and 64.80 on MMLU. Its primary use case is general-purpose text generation and reasoning, drawing on the merged weights for balanced performance across tasks.
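For intuition, slerp merging interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which preserves the norm geometry of the parameters better than plain averaging. The sketch below (NumPy, standard slerp formula; the function name and fallback threshold are illustrative, not taken from any specific merge tool) shows the core operation that would be applied tensor-by-tensor:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between tensors v0 and v1 at fraction t."""
    # Angle between the (flattened, normalized) tensors
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n.ravel(), v1_n.ravel()), -1.0, 1.0)
    # Near-parallel tensors: fall back to linear interpolation to avoid
    # division by sin(omega) ~ 0
    if abs(dot) > 1.0 - eps:
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)
    so = np.sin(omega)
    # Weighted combination along the great-circle arc
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

A merge like this one would apply such an interpolation to every matching parameter tensor of the two parent models, with `t` (the mixing fraction) chosen per layer or globally.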
