Gille/StrangeMerges_48-7B-dare_ties
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Mar 26, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Gille/StrangeMerges_48-7B-dare_ties is a 7-billion-parameter language model created by Gille by merging three models (StrangeMerges_46-7B-dare_ties, Percival_01-7b-slerp, and StrangeMerges_47-7B-dare_ties) with the dare_ties merge method, using Locutusque/Hercules-4.0-Mistral-v0.2-7B as the base model. It achieves an average score of 57.89 on the Open LLM Leaderboard, demonstrating capability across a range of reasoning and language-understanding tasks, and is suitable for general-purpose text generation and understanding applications where a 7B-parameter model is appropriate.
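The dare_ties method combines two ideas: DARE (randomly drop a fraction of each fine-tuned model's task-vector deltas and rescale the survivors to preserve their expected value) and TIES (elect a majority sign per parameter and merge only the deltas that agree with it). As a rough illustration only, here is a toy NumPy sketch of that pipeline on small weight matrices; it is not mergekit's actual implementation, and the `dare_rescale` helper and 0.5 drop rate are illustrative choices, not values taken from this model's merge recipe.

```python
import numpy as np

def dare_rescale(delta, drop_rate, rng):
    """DARE step: randomly zero a fraction of task-vector entries,
    then rescale the survivors by 1/(1 - drop_rate) so the expected
    value of the delta is unchanged."""
    mask = rng.random(delta.shape) >= drop_rate
    return delta * mask / (1.0 - drop_rate)

# Toy example: a shared base weight matrix plus two fine-tuned variants.
rng = np.random.default_rng(0)
base = rng.normal(size=(4, 4))
ft_a = base + rng.normal(scale=0.1, size=(4, 4))
ft_b = base + rng.normal(scale=0.1, size=(4, 4))

# Task vectors (fine-tuned minus base), sparsified by DARE.
deltas = np.stack([dare_rescale(ft - base, 0.5, rng) for ft in (ft_a, ft_b)])

# TIES-style sign election: per parameter, keep only deltas whose sign
# agrees with the sign of the summed deltas, then average the survivors.
elected = np.sign(deltas.sum(axis=0))
agrees = np.sign(deltas) == elected
kept = np.where(agrees, deltas, 0.0)
merged = base + kept.sum(axis=0) / np.maximum(agrees.sum(axis=0), 1)
```

In the real merge, mergekit applies this per-tensor across all three source models, with per-model density and weight parameters controlling the drop rate and the contribution of each model's deltas.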


Popular Sampler Settings

Featherless tracks the three parameter combinations most used by its users for this model. The tunable sampler parameters are: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.