kwoncho/Llama-3.2-3B-KO-EN-Translation

Hosted on Hugging Face · Text Generation · Model Size: 3.2B · Quantization: BF16 · Context Length: 32k · Architecture: Transformer

kwoncho/Llama-3.2-3B-KO-EN-Translation is a 3.2-billion-parameter model developed by Hyunkwon Cho and fine-tuned specifically for Korean-English translation. It addresses common Korean-English translation errors found in the base Llama 3.2 models and is strongest at sentence-level translation in financial and business contexts. The model was trained on approximately 470,000 finance-related sentence pairs plus an additional 500,000 general sentence pairs.


Overview

The model was fine-tuned to improve Korean-English translation and to address limitations observed in the base Llama 3.2 models. Its training data comprises approximately 470,000 finance-related sentence pairs and an additional 500,000 general sentence pairs.
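A minimal loading and inference sketch with the Hugging Face transformers library is shown below. The chat-style prompt, the generation settings, and the assumption that the tokenizer ships a Llama 3.2 chat template are not documented on this card; adjust them to whatever format the fine-tune actually expects.

```python
# Minimal sketch: load the model with Hugging Face transformers and translate
# one Korean sentence to English. The chat-style prompt below is an assumption;
# fall back to a plain prompt string if the tokenizer has no chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kwoncho/Llama-3.2-3B-KO-EN-Translation"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

messages = [
    {
        "role": "user",
        "content": (
            "Translate the following Korean sentence into English:\n"
            "올해 3분기 영업이익은 전년 동기 대비 12% 증가했다."
        ),
    },
]

# Build input ids from the chat template and generate greedily.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens (the translation).
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```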

Key Capabilities

  • Specialized Translation: Optimized for Korean-English translation, particularly within financial and business domains.
  • Sentence-Level Accuracy: Primarily trained for sentence-unit translation.
  • Contextual Training: Utilizes a large corpus of domain-specific and general Korean-English sentence pairs.

Limitations and Considerations

  • Long Text: May produce errors on long texts or inputs containing line breaks, since training was done on sentence-level units; splitting input into sentences before translation helps (see the sketch after this list).
  • Numerical Accuracy: Consider adding an instruction such as "Use the numbers from the original text for percentages, amounts, and figures" to keep the model from generating incorrect numerical values.
  • English-Korean Performance: English-to-Korean translation is noted to be significantly weaker than Korean-to-English translation.
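As referenced above, the following is a hedged sketch of how both mitigations might be combined: split long Korean input into sentences and prepend the numeric-fidelity instruction to each prompt. The helper names and the naive regex-based splitter are illustrative assumptions, not part of the model card; a dedicated Korean sentence segmenter (e.g. kss) would be more robust.

```python
# Illustrative workaround for the limitations above: translate long Korean text
# sentence by sentence and prepend the numeric-fidelity instruction to each
# prompt. Helper names and the naive splitter are assumptions, not model-card API.
import re

NUMERIC_HINT = (
    "Use the numbers from the original text for percentages, amounts, and figures. "
)

def split_sentences(text: str) -> list[str]:
    # Naive split on sentence-ending punctuation and line breaks; swap in a
    # proper Korean segmenter for production use.
    parts = re.split(r"(?<=[.!?。])\s+|\n+", text.strip())
    return [p for p in parts if p]

def build_prompts(text: str) -> list[str]:
    return [
        f"{NUMERIC_HINT}Translate the following Korean sentence into English:\n{sent}"
        for sent in split_sentences(text)
    ]

long_text = "3분기 매출은 4조 2천억 원이었다.\n영업이익률은 8.5%로 집계됐다."
for prompt in build_prompts(long_text):
    print(prompt)  # feed each prompt to the model as in the loading sketch above
```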