uygarkurt/llama-3-merged-linear
Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 8k · Concurrency Cost: 1 · Published: May 9, 2024 · License: MIT · Architecture: Transformer · Open Weights
uygarkurt/llama-3-merged-linear is an 8-billion-parameter language model created by uygarkurt by linearly merging the top three Llama-3 models from the Open LLM Leaderboard. The merge was performed with the mergekit library, which combines the weights of existing models without any additional training, with the aim of producing a model that ranks higher than any of its constituent Llama-3 base models.
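A linear merge is simply a weighted average of corresponding parameter tensors across the source models. The sketch below illustrates the idea in plain Python on toy "state dicts"; mergekit's linear method operates on real checkpoints, and the weights and tensor names here are purely illustrative.

```python
# Minimal sketch of a linear merge: each merged weight is a weighted
# average of the corresponding weights from the source models.
# Tensors are modeled as flat lists of floats for illustration.

def linear_merge(state_dicts, weights):
    """Elementwise weighted average of matching parameter tensors."""
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize so weights sum to 1
    merged = {}
    for name in state_dicts[0]:
        params = [sd[name] for sd in state_dicts]
        merged[name] = [
            sum(w * p[i] for w, p in zip(norm, params))
            for i in range(len(params[0]))
        ]
    return merged

# Three toy "models", each with one parameter tensor.
models = [
    {"layer.weight": [1.0, 2.0]},
    {"layer.weight": [3.0, 4.0]},
    {"layer.weight": [5.0, 6.0]},
]
merged = linear_merge(models, weights=[1.0, 1.0, 1.0])
# With equal weights the merge reduces to the elementwise mean, ~[3.0, 4.0].
print(merged["layer.weight"])
```

With unequal weights (e.g. `weights=[2.0, 1.0, 1.0]`), the first model's parameters dominate the average; mergekit exposes the same knob per source model in its merge configuration.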
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model involve the following sampler settings:
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
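To make the knobs above concrete, the sketch below applies temperature, top_k, top_p, and min_p filtering to one step's logits in plain Python. The logit values are made up, and real inference servers may apply these filters in a different order; the penalty settings (frequency, presence, repetition), which adjust logits of already-generated tokens, are not shown.

```python
import math

def apply_samplers(logits, temperature=1.0, top_k=0, top_p=1.0, min_p=0.0):
    """Return next-token probabilities after the common sampler filters."""
    # Temperature: rescale logits, then softmax (lower temp -> sharper).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    probs = [e / z for e in exps]

    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    keep = set(order)
    if top_k > 0:
        # top_k: keep only the k most likely tokens.
        keep &= set(order[:top_k])
    if top_p < 1.0:
        # top_p: keep the smallest set whose cumulative probability >= p.
        cum, nucleus = 0.0, set()
        for i in order:
            nucleus.add(i)
            cum += probs[i]
            if cum >= top_p:
                break
        keep &= nucleus
    if min_p > 0.0:
        # min_p: drop tokens below min_p times the best token's probability.
        best = probs[order[0]]
        keep &= {i for i in order if probs[i] >= min_p * best}

    # Zero out filtered tokens and renormalize.
    filtered = [p if i in keep else 0.0 for i, p in enumerate(probs)]
    z = sum(filtered)
    return [p / z for p in filtered]

# Example: a 4-token vocabulary with moderate filtering.
print(apply_samplers([2.0, 1.0, 0.1, -1.0],
                     temperature=0.8, top_k=3, top_p=0.9))
```

With `top_k=1` the distribution collapses onto the single most likely token (greedy decoding), while `temperature=1.0, top_p=1.0, top_k=0, min_p=0.0` leaves the softmax distribution untouched.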