Gille/StrangeMerges_29-7B-dare_ties
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 21, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Gille/StrangeMerges_29-7B-dare_ties is a 7-billion-parameter language model created by Gille, produced by merging Gille/StrangeMerges_21-7B-slerp and CultriX/MonaTrix-v4 with the dare_ties method. The model shows strong general reasoning, scoring an average of 76.09 on the Open LLM Leaderboard, with notable performance in commonsense reasoning and question answering. It is suitable for a variety of general-purpose language generation tasks.
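Merges like this are typically produced with mergekit. The sketch below shows what a dare_ties merge config for the two named source models could look like; the base model, density/weight values, and dtype are illustrative assumptions, not taken from the actual published recipe.

```yaml
# Hypothetical mergekit config sketch for a dare_ties merge of the
# two source models named above. base_model, density, weight, and
# dtype are assumed values for illustration only.
models:
  - model: Gille/StrangeMerges_21-7B-slerp
    parameters:
      density: 0.5   # fraction of delta weights kept (assumption)
      weight: 0.5    # mixing weight (assumption)
  - model: CultriX/MonaTrix-v4
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1  # assumed base; common for 7B merges
dtype: bfloat16
```

With mergekit installed, a config like this would be run with `mergekit-yaml config.yml ./output-model`.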
