Gille/StrangeMerges_47-7B-dare_ties
Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Gille/StrangeMerges_47-7B-dare_ties is a 7 billion parameter language model created by Gille, produced by merging Gille/StrangeMerges_46-7B-dare_ties, AurelPx/Percival_01-7b-slerp, and kaist-ai/mistral-orpo-beta with the dare_ties method. The model achieves an average score of 71.91 on the Open LLM Leaderboard, with notable results in reasoning (AI2 Reasoning Challenge: 69.45) and common sense (HellaSwag: 86.69, Winogrande: 82.24). It is suitable for general language understanding and generation tasks, particularly those requiring robust reasoning and factual recall.
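Merges like this are typically produced with mergekit. Below is a sketch of what the merge configuration might look like; the exact density/weight values and the choice of base model are assumptions for illustration, not the settings actually used for this model.

```yaml
# Hypothetical mergekit config sketch for a dare_ties merge of the three
# source models. Density/weight values and base_model are illustrative only.
models:
  - model: Gille/StrangeMerges_46-7B-dare_ties
    parameters:
      density: 0.5   # fraction of delta weights kept (assumed value)
      weight: 0.4    # contribution to the merged model (assumed value)
  - model: AurelPx/Percival_01-7b-slerp
    parameters:
      density: 0.5
      weight: 0.3
  - model: kaist-ai/mistral-orpo-beta
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1  # assumed common Mistral base
dtype: bfloat16
```

In dare_ties, each source model's weight deltas relative to the base are randomly sparsified (controlled by `density`), rescaled, and then combined with TIES-style sign election before being added back to the base model.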