NilanE/tinyllama-en_ja-translation-v2
Text generation · Concurrency cost: 1 · Model size: 1.1B · Quant: BF16 · Context length: 2k · License: apache-2.0 · Architecture: Transformer · Open weights
NilanE/tinyllama-en_ja-translation-v2 is a TinyLlama-based model developed by NilanE, designed specifically for Japanese-to-English translation. The model targets long-context translation and is optimized for inputs between 500 and 1000 tokens. It produces deterministic outputs when run with a temperature of 0, making it suitable for consistent translation tasks.
Overview
NilanE/tinyllama-en_ja-translation-v2 is an in-progress translation model built upon the TinyLlama architecture, specializing in Japanese-to-English language conversion. It is particularly optimized for handling longer input contexts, with a recommended length of 500-1000 tokens for source text.
Key Capabilities
- Japanese-to-English Translation: Primary function is translating Japanese text into English.
- Long-Context Handling: Designed to process and translate longer passages of text, specifically within the 500-1000 token range.
- Deterministic Output: Users can achieve consistent, repeatable translation results by setting `do_sample=False` in Hugging Face transformers, or by setting the temperature to 0 during inference.
Usage
To use this model for translation, the recommended prompt format is:
"Translate this from Japanese to English:\n### JAPANESE: {source text} \n### ENGLISH: "
Good For
- Developers requiring a specialized model for Japanese-English translation of moderately long texts.
- Applications where consistent and deterministic translation outputs are crucial.
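The recommended prompt format and the deterministic-decoding setting described above can be combined in a small helper. The sketch below is illustrative, not part of the model card: the `build_prompt` function and `GREEDY_KWARGS` names are assumptions, and the transformers calls are shown in comments so the example stays lightweight (loading the model would download its weights).

```python
# Prompt construction and deterministic decoding settings for
# NilanE/tinyllama-en_ja-translation-v2 (illustrative sketch).

def build_prompt(source: str) -> str:
    """Wrap Japanese source text in the model's recommended prompt format."""
    return f"Translate this from Japanese to English:\n### JAPANESE: {source} \n### ENGLISH: "

# Greedy decoding (do_sample=False) makes translations repeatable,
# equivalent in effect to running with temperature 0.
GREEDY_KWARGS = {"do_sample": False, "max_new_tokens": 512}

# With Hugging Face transformers (not executed here):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("NilanE/tinyllama-en_ja-translation-v2")
# model = AutoModelForCausalLM.from_pretrained("NilanE/tinyllama-en_ja-translation-v2")
# inputs = tok(build_prompt("こんにちは、世界"), return_tensors="pt")
# out = model.generate(**inputs, **GREEDY_KWARGS)
# print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Keeping the source text within the recommended 500-1000 token range is advisable, since that is the window the model was tuned for.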