Gille/StrangeMerges_50-7B-slerp
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 8K
Published: Mar 26, 2024
License: apache-2.0
Architecture: Transformer
Open Weights
Status: Cold
Gille/StrangeMerges_50-7B-slerp is a 7-billion-parameter language model created by Gille, produced by merging liminerity/M7-7b and Gille/StrangeMerges_49-7B-dare_ties with the slerp method. The merge applies distinct interpolation weights to the self-attention and MLP layers, and the resulting model achieves an average Open LLM Leaderboard score of 76.31. It is suitable for general text generation tasks and performs strongly across a range of reasoning and commonsense benchmarks.
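Slerp (spherical linear interpolation) merges each pair of weight tensors along a great-circle arc rather than a straight line, which preserves the geometry of the weights better than plain averaging. The sketch below illustrates the operation; the per-module interpolation factors at the end are illustrative assumptions, not this model's actual merge configuration.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors with factor t in [0, 1]."""
    v0_flat = v0.flatten().float()
    v1_flat = v1.flatten().float()
    # Angle between the two tensors, treated as vectors on the unit sphere
    dot = torch.clamp(
        (v0_flat / (v0_flat.norm() + eps)) @ (v1_flat / (v1_flat.norm() + eps)),
        -1.0, 1.0,
    )
    omega = torch.arccos(dot)
    if omega.abs() < eps:
        # Nearly colinear tensors: plain linear interpolation is numerically safer
        merged = (1.0 - t) * v0_flat + t * v1_flat
    else:
        sin_omega = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / sin_omega) * v0_flat + (
            torch.sin(t * omega) / sin_omega
        ) * v1_flat
    return merged.reshape(v0.shape).to(v0.dtype)

# Hypothetical per-module factors, mirroring the card's note that
# self-attention and MLP layers use different weightings.
t_by_module = {"self_attn": 0.3, "mlp": 0.7}
```

In practice a merge tool such as mergekit applies a factor like this to every matching tensor pair across the two source checkpoints.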
Popular Sampler Settings
The three most popular parameter combinations among Featherless users for this model adjust the following sampler settings:
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
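These settings map directly onto an OpenAI-compatible completion request. The sketch below uses illustrative values only; the base URL and the extension fields (top_k, repetition_penalty, min_p, passed via extra_body) are assumptions to verify against the provider's API documentation.

```python
from openai import OpenAI

# Base URL is an assumption for an OpenAI-compatible endpoint; check it
# against the provider's documentation before use.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="Gille/StrangeMerges_50-7B-slerp",
    messages=[{"role": "user", "content": "Summarize slerp model merging."}],
    # Standard OpenAI sampler parameters (values are illustrative)
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Common extensions on OpenAI-compatible servers, sent as extra JSON fields
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```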