LeoLM/leo-hessianai-13b
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Sep 5, 2023 · Architecture: Transformer

LeoLM/leo-hessianai-13b is a 13-billion-parameter causal decoder-only transformer language model developed by LAION and HessianAI. It extends Llama-2's capabilities into German through continued pretraining on a large corpus of German-language text. The model targets German-language applications, offers strong performance in both German and English, and is well suited for research and commercial LLM development focused on the German language.
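As a text-generation model served behind an API, it is typically queried with an OpenAI-style completion request. The sketch below builds such a request body; the field names follow the common OpenAI completions convention and are an assumption here, not Featherless-specific documentation.

```python
def build_request(prompt: str,
                  temperature: float = 0.7,
                  top_p: float = 0.9,
                  max_tokens: int = 128) -> dict:
    """Assemble an OpenAI-style completion request body for this model.

    Field names ("model", "prompt", "temperature", "top_p", "max_tokens")
    are assumed from the common completions convention.
    """
    return {
        "model": "LeoLM/leo-hessianai-13b",
        "prompt": prompt,
        "temperature": temperature,
        "top_p": top_p,
        "max_tokens": max_tokens,
    }


# Example: a German prompt, matching the model's target language.
body = build_request("Die Hauptstadt von Hessen ist")
```

The resulting dict can be JSON-encoded and POSTed to any completions-compatible endpoint; note that with a 4k context length, prompt plus `max_tokens` must stay under 4096 tokens.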


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model draw on the following sampler parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
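The parameters above can be gathered into a single sampler-settings dict before sending a request. The sketch below uses exactly the parameter names listed; the default values and range checks are illustrative assumptions, not recorded user configurations.

```python
def sampler_settings(temperature: float = 1.0,
                     top_p: float = 1.0,
                     top_k: int = 0,
                     frequency_penalty: float = 0.0,
                     presence_penalty: float = 0.0,
                     repetition_penalty: float = 1.0,
                     min_p: float = 0.0) -> dict:
    """Collect sampler parameters into one dict with basic range checks.

    Defaults are illustrative, not the popular Featherless configs.
    """
    if temperature < 0.0:
        raise ValueError("temperature must be non-negative")
    if not 0.0 <= top_p <= 1.0:
        raise ValueError("top_p must be in [0, 1]")
    if not 0.0 <= min_p <= 1.0:
        raise ValueError("min_p must be in [0, 1]")
    return {
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "frequency_penalty": frequency_penalty,
        "presence_penalty": presence_penalty,
        "repetition_penalty": repetition_penalty,
        "min_p": min_p,
    }


# Example: a moderately creative config for German text generation.
settings = sampler_settings(temperature=0.8, top_p=0.95, repetition_penalty=1.1)
```

Merging such a dict into the request body keeps sampler tuning in one place, so a config can be swapped out without touching the rest of the request.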