elyza/ELYZA-japanese-Llama-2-13b

Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4K · Published: Dec 25, 2023 · License: llama2 · Architecture: Transformer · Open Weights · Warm

ELYZA-japanese-Llama-2-13b is a 13 billion parameter language model developed by ELYZA, based on the Llama 2 architecture. It has undergone additional pre-training to enhance its Japanese language capabilities. This model is specifically optimized for processing and generating text in Japanese, making it suitable for applications requiring strong performance in that language.


ELYZA-japanese-Llama-2-13b Overview

ELYZA-japanese-Llama-2-13b is a 13 billion parameter language model developed by ELYZA, built upon the Llama 2 foundation. Its primary distinction is extensive additional pre-training on Japanese text, which substantially improves its proficiency in the language. This makes it a specialized tool for tasks where robust Japanese language understanding and generation are critical.
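As a base (non-instruct) Llama 2 model, it can be loaded with the standard Hugging Face transformers API. A minimal sketch, assuming `transformers` and `torch` are installed and enough memory is available for a 13B model:

```python
# Sketch: loading ELYZA-japanese-Llama-2-13b with Hugging Face transformers.
MODEL_ID = "elyza/ELYZA-japanese-Llama-2-13b"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imports live inside the function so the sketch can be read (and the
    # constant reused) without the heavy dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # half precision; a 13B model still needs ~26 GB
        device_map="auto",          # spread across available GPUs/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Since this is a base model, it performs plain text completion; for conversational use, the instruct variants are the better fit.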

Key Capabilities

  • Enhanced Japanese Language Processing: The model has undergone supplementary pre-training to extend Llama 2's capabilities for Japanese text.
  • Llama 2 Architecture: Benefits from the robust and widely recognized Llama 2 base architecture.
  • Multiple Variants: ELYZA offers several related models, including ELYZA-japanese-Llama-2-13b-instruct for instruction-following, and ELYZA-japanese-Llama-2-13b-fast and ELYZA-japanese-Llama-2-13b-fast-instruct with larger vocabularies for potentially faster or more nuanced processing.

Use Cases

This model is particularly well-suited for applications requiring high-quality Japanese language understanding and generation. Developers can leverage it for tasks such as:

  • Japanese text generation and completion.
  • Building chatbots or conversational AI systems in Japanese.
  • Content creation and summarization for Japanese language materials.
  • Research and development in Japanese natural language processing.
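For the conversational use cases above, the `-instruct` variants expect Llama 2's chat prompt format. A sketch of that formatting; the Japanese system prompt shown here is illustrative, not the model's documented default:

```python
# Sketch: building a Llama-2-style chat prompt for the -instruct variants.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"
# Illustrative system prompt: "You are a sincere and capable Japanese assistant."
DEFAULT_SYSTEM = "あなたは誠実で優秀な日本語アシスタントです。"

def build_prompt(user_message: str, system: str = DEFAULT_SYSTEM) -> str:
    """Wrap a user message and system prompt in the Llama 2 chat template."""
    return f"{B_INST} {B_SYS}{system}{E_SYS}{user_message} {E_INST}"
```

The resulting string is then tokenized and passed to the model like any other prompt; the base model described on this page does not need this template.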

Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model are built from the following sampler settings:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
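These settings are typically supplied per request. A sketch of assembling such a payload for an OpenAI-compatible completions endpoint; the values below are illustrative defaults, not the actual top configurations, and the handling of non-OpenAI parameters (top_k, repetition_penalty, min_p) is an assumption about the serving API:

```python
# Sketch: a per-request payload combining standard OpenAI sampler fields
# with extended sampler parameters. All values are illustrative.
SAMPLER_SETTINGS = {
    "temperature": 0.7,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
}
EXTRA_SAMPLERS = {
    # Not part of the core OpenAI schema; many OpenAI-compatible servers
    # accept these as additional body fields.
    "top_k": 40,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

def completion_request(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a completions payload; sending it requires an API key."""
    return {
        "model": "elyza/ELYZA-japanese-Llama-2-13b",
        "prompt": prompt,
        "max_tokens": max_tokens,
        **SAMPLER_SETTINGS,
        **EXTRA_SAMPLERS,
    }
```

With the `openai` Python client, such a payload would be sent via `client.completions.create(**completion_request(...))` against the provider's base URL.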