cstr/llama3-8b-spaetzle-v20
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 8k
License: llama3
Architecture: Transformer
cstr/llama3-8b-spaetzle-v20 is an 8-billion-parameter language model created by merging cstr/llama3-8b-spaetzle-v13 and nbeerbower/llama-3-wissenschaft-8B-v2 with the dare_ties method. It scores an average of 71.83 on the Open LLM Leaderboard, including 70.39 on ARC and 68.52 on MMLU. The model is intended for general language generation and shows solid performance across common reasoning and knowledge benchmarks.
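For readers curious how such a merge is expressed in practice, below is a minimal sketch using mergekit, a common tool that implements dare_ties. The density and weight values are illustrative assumptions, not the actual recipe behind this model, and the output path is hypothetical.

```python
# Minimal sketch of a dare_ties merge with mergekit (pip install mergekit).
# Density/weight values are assumed for illustration; they are NOT the
# actual parameters used to produce cstr/llama3-8b-spaetzle-v20.
import subprocess
from pathlib import Path

config = """\
merge_method: dare_ties
base_model: cstr/llama3-8b-spaetzle-v13
models:
  - model: cstr/llama3-8b-spaetzle-v13
  - model: nbeerbower/llama-3-wissenschaft-8B-v2
    parameters:
      density: 0.60   # fraction of delta weights kept (assumed value)
      weight: 0.50    # mixing weight for this model's deltas (assumed value)
dtype: bfloat16
"""

Path("dare_ties_config.yml").write_text(config)

# mergekit-yaml is mergekit's standard CLI entry point.
subprocess.run(
    ["mergekit-yaml", "dare_ties_config.yml", "./llama3-8b-spaetzle-v20-merged"],
    check=True,
)
```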
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model cover the following sampler settings (see the sketch after this list):
temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p
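A hedged sketch of applying one such sampler configuration when calling the model through an OpenAI-compatible endpoint follows. The base URL and all parameter values are assumptions for illustration; non-standard samplers such as top_k, min_p, and repetition_penalty are passed via extra_body, and whether the server honors them depends on the provider.

```python
# Sketch: querying the model with a sampler configuration through an
# OpenAI-compatible API. Base URL and values are illustrative assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="cstr/llama3-8b-spaetzle-v20",
    messages=[{"role": "user", "content": "Explain the dare_ties merge method briefly."}],
    # Standard OpenAI sampler parameters (illustrative values):
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Non-standard samplers are commonly forwarded via extra_body on
    # OpenAI-compatible servers; support varies by provider.
    extra_body={
        "top_k": 40,                # illustrative value
        "min_p": 0.05,              # illustrative value
        "repetition_penalty": 1.1,  # illustrative value
    },
)
print(response.choices[0].message.content)
```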