Gille/StrangeMerges_28-7B-dare_ties
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 21, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Gille/StrangeMerges_28-7B-dare_ties is a 7-billion-parameter language model by Gille, produced with the dare_ties merge method using CultriX/MonaTrix-v4 as the base model and merging in eren23/ogno-monarch-jaskier-merge-7b-v2 and Gille/StrangeMerges_25-7B-dare_ties. The model achieves an average score of 75.86 on the Open LLM Leaderboard, indicating strong general reasoning and language understanding, and is suitable for a variety of common NLP tasks.
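A merge like this is typically produced with mergekit. The sketch below shows what a dare_ties configuration for the models named above could look like; the `density` and `weight` values are illustrative assumptions, not the settings actually used for this model.

```yaml
# Hypothetical mergekit config for a dare_ties merge of the listed models.
# density/weight values are placeholders, not the model's actual recipe.
models:
  - model: eren23/ogno-monarch-jaskier-merge-7b-v2
    parameters:
      density: 0.5   # fraction of delta weights kept (DARE pruning)
      weight: 0.5    # contribution of this model to the merge
  - model: Gille/StrangeMerges_25-7B-dare_ties
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: CultriX/MonaTrix-v4
dtype: bfloat16
```

With mergekit installed, such a config is applied with `mergekit-yaml config.yml ./output-dir`. In dare_ties, each component's weight deltas from the base model are randomly pruned to the given density and rescaled, then sign-consensus merging (as in TIES) resolves conflicting parameter updates.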
