Gille/StrangeMerges_3-7B-slerp
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Jan 27, 2024
License: apache-2.0
Architecture: Transformer
Open Weights: Yes

Gille/StrangeMerges_3-7B-slerp is a 7-billion-parameter language model created by Gille, formed by merging FelixChao/WestSeverus-7B-DPO-v2 and Gille/StrangeMerges_1-7B-slerp using spherical linear interpolation (slerp). The model demonstrates strong general reasoning capabilities, achieving an average score of 74.57 on the Open LLM Leaderboard. It is suitable for a variety of general-purpose language generation tasks, particularly those that benefit from its balanced performance across multiple benchmarks.
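For readers unfamiliar with slerp merging: rather than averaging two models' weights linearly, slerp interpolates along the arc between them on a hypersphere, which tends to preserve the magnitude of the weight tensors. The sketch below illustrates the slerp formula itself on NumPy arrays; it is a simplified illustration, not the actual merge pipeline used for this model (merges like this are typically produced with tooling such as mergekit, applied per weight tensor with a chosen interpolation factor t).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two tensors rather than the straight line.
    """
    v0_f = v0.ravel().astype(np.float64)
    v1_f = v1.ravel().astype(np.float64)
    # Angle between the flattened tensors
    cos_omega = np.dot(v0_f, v1_f) / (np.linalg.norm(v0_f) * np.linalg.norm(v1_f))
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)
    if abs(np.sin(omega)) < eps:
        # Nearly parallel tensors: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Toy example: interpolating halfway between two orthogonal unit vectors
# keeps the result on the unit sphere (a linear average would shrink it).
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In a real merge this function would be applied tensor-by-tensor across both checkpoints, often with a different t per layer group (e.g. attention vs. MLP weights), as mergekit's slerp method allows.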
