Gille/StrangeMerges_21-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Gille/StrangeMerges_21-7B-slerp is a 7-billion-parameter language model created by Gille, produced by a slerp (spherical linear interpolation) merge of StrangeMerges_20-7B-slerp and NeuTrixOmniBe-7B-model-remix. The model targets general language tasks, and the merge achieves an average score of 76.29 on the Open LLM Leaderboard, including 88.95 on HellaSwag and 84.61 on Winogrande, making it suitable for applications requiring robust language understanding and generation.
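The slerp technique mentioned above interpolates between two parent models' weight tensors along a great-circle arc rather than a straight line, which tends to preserve each parent's weight geometry better than plain averaging. The following is a minimal sketch of slerp applied to a single tensor pair; it is not the author's exact merge recipe (tools like mergekit apply this per layer with configurable interpolation factors), and the function name and `t` parameter are illustrative:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0.0 returns v0, t=1.0 returns v1; intermediate values follow
    the great-circle arc between the flattened tensors.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Angle between the two tensors, treated as high-dimensional vectors.
    dot = np.dot(a / np.linalg.norm(a), b / np.linalg.norm(b))
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        out = (1.0 - t) * a + t * b
    else:
        s = np.sin(theta)
        out = (np.sin((1.0 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b
    return out.reshape(v0.shape)

# Toy example on 2-D vectors: the midpoint lies on the unit arc.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(slerp(0.5, a, b))  # point halfway along the arc from a to b
```

In a real merge, this interpolation would be repeated for every matching parameter tensor of the two parent checkpoints, possibly with different `t` values per layer group.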