jphme/em_german_13b_v01

Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Open weights

jphme/em_german_13b_v01 is a 13-billion-parameter, Llama2-based model fine-tuned by jphme on a large dataset of German instructions. It is optimized for German-language proficiency (understanding, generating, and interacting with German content) and is part of a family of models built for robust German-centric natural language processing.


EM German 13B v01 Overview

The EM German 13B v01 model, developed by jphme, is a 13 billion parameter language model built upon the Llama2 architecture. It has been extensively fine-tuned using a substantial dataset of German instructions, making it highly proficient in the German language. This model is part of a broader EM German family, which includes versions based on Llama2, Mistral, and LeoLM, all optimized for German text.
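Instruction-tuned models like this one expect a specific prompt layout. A minimal sketch of a prompt-formatting helper is shown below, assuming the system-prompt plus USER:/ASSISTANT: template commonly documented for the EM German family; verify the exact wording against the model card before relying on it.

```python
def build_prompt(instruction: str,
                 system: str = "Du bist ein hilfreicher Assistent.") -> str:
    """Format a single-turn prompt for an EM German model.

    Assumes the USER:/ASSISTANT: template used by the EM German
    family; the exact template is an assumption here, so check the
    upstream model card before using it in production.
    """
    return f"{system} USER: {instruction} ASSISTANT:"

# Example: ask the model to explain a common German spelling confusion.
prompt = build_prompt("Erkläre den Unterschied zwischen 'seit' und 'seid'.")
```

The trailing `ASSISTANT:` marker is what cues the model to begin its reply; leaving it off typically degrades instruction following.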

Key Capabilities

  • German Language Proficiency: Optimized for understanding, generating, and interacting with German language content.
  • Instruction Following: Fine-tuned on diverse German instructions to provide accurate and relevant responses.
  • Model Family: Part of a series of German-optimized models, offering various base architectures and sizes.

Good For

  • German NLP Applications: Ideal for tasks requiring strong German language comprehension and generation.
  • Research and Development: Suitable for exploring German-specific language models and their performance.
  • Custom German Solutions: Can serve as a foundation for building specialized applications that require robust German language capabilities.
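For application builders, the model is typically reached through an OpenAI-compatible completions endpoint. The sketch below only constructs a request payload (no network call); the field names follow the OpenAI completions schema, and the endpoint details, parameter values, and the `completion_request` helper are illustrative assumptions, not Featherless documentation.

```python
import json


def completion_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build a hypothetical payload for an OpenAI-compatible
    /v1/completions endpoint serving this model.

    Field names follow the OpenAI completions schema; the values
    here are placeholders, not recommended settings.
    """
    return {
        "model": "jphme/em_german_13b_v01",
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.7,  # example value only
    }


# Serialize the payload as it would be sent in an HTTP request body.
payload = json.dumps(completion_request("Was ist die Hauptstadt von Bayern?"))
```

Keeping payload construction in a small helper makes it easy to swap models or sampler settings without touching request-handling code.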

Popular Sampler Settings

Featherless tracks the three parameter combinations most used for this model. The specific values did not survive extraction from the interactive page, but the sampler parameters users commonly adjust are:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
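A configuration covering these parameters might look like the sketch below. The values are illustrative placeholders (the actual top Featherless configurations are not reproduced here), and the `validate` helper is a hypothetical sanity check, not part of any API.

```python
# Example sampler configuration covering the parameters listed above.
# Values are illustrative placeholders, not recommended settings.
sampler_settings = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}


def validate(settings: dict) -> None:
    """Basic sanity checks against the usual ranges for each sampler knob."""
    assert 0.0 <= settings["temperature"] <= 2.0
    assert 0.0 < settings["top_p"] <= 1.0
    assert settings["top_k"] >= 0
    assert settings["repetition_penalty"] >= 1.0
    assert 0.0 <= settings["min_p"] <= 1.0


validate(sampler_settings)
```

Note that `repetition_penalty` is multiplicative (1.0 means no penalty), while `frequency_penalty` and `presence_penalty` are additive (0.0 means no penalty), so their neutral values differ.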