cstr/llama3-8b-spaetzle-v13
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · License: llama3 · Architecture: Transformer · Status: Warm
cstr/llama3-8b-spaetzle-v13 is an 8-billion-parameter language model merged from Azure99/blossom-v5-llama3-8b and VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct, with a context length of 8192 tokens. The model performs strongly in both German and English, scoring 64.14 on EQ-Bench v2_de and 75.59 on the English EQ-Bench (v2). It is well suited to general-purpose conversational AI and tasks requiring robust language understanding in either language.
Popular Sampler Settings
The three parameter combinations most commonly used by Featherless users for this model involve the following sampler settings:
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
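As a sketch of how these sampler settings are typically passed to an OpenAI-compatible completions endpoint, the snippet below assembles a request payload for this model. The endpoint URL, the exact set of supported parameters, and all numeric values are assumptions for illustration; they are not the actual popular configurations referenced above.

```python
import json

# Hypothetical payload for an OpenAI-compatible chat-completions API.
# Parameter values are illustrative placeholders, not recommended settings.
def build_request(prompt: str) -> dict:
    return {
        "model": "cstr/llama3-8b-spaetzle-v13",
        "messages": [{"role": "user", "content": prompt}],
        # Sampler settings (tune to taste):
        "temperature": 0.7,         # randomness of token sampling
        "top_p": 0.9,               # nucleus sampling: keep smallest set with cumulative prob >= 0.9
        "top_k": 40,                # restrict sampling to the 40 most likely tokens
        "frequency_penalty": 0.0,   # penalize tokens proportionally to prior frequency
        "presence_penalty": 0.0,    # penalize tokens that have appeared at all
        "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
        "min_p": 0.05,              # drop tokens below 5% of the top token's probability
    }

# Serialize for an HTTP POST body (transport layer omitted).
body = json.dumps(build_request("Wie heißt die Hauptstadt von Bayern?"))
```

Note that some of these knobs (e.g. `repetition_penalty`, `min_p`) are extensions beyond the core OpenAI parameter set; whether a given endpoint accepts them should be checked against its API documentation.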