KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt

Text generation model
  • Concurrency cost: 4
  • Model size: 70B
  • Quantization: FP8
  • Context length: 32k
  • Published: Dec 25, 2025
  • License: apache-2.0
  • Architecture: Transformer
  • Open weights

KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt is a 70-billion-parameter Llama 3.3 instruction-tuned model developed by KickItLikeShika. Fine-tuned from unsloth/llama-3.3-70b-instruct-unsloth-bnb-4bit, it specializes in English-to-Tatar machine translation, a low-resource translation task, and offers a context length of 32,768 tokens.
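A minimal inference sketch with Hugging Face transformers is shown below. The model card does not document a required prompt format, so the translation instruction and the reliance on the Llama 3.3 chat template inherited from the base model are assumptions; the hardware settings (bfloat16, device_map="auto") are illustrative, since a 70B model needs multiple GPUs or further quantization.

```python
# Illustrative inference sketch, assuming the model loads through the
# standard transformers API and uses the Llama 3.3 chat template
# inherited from its base model. The prompt wording is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 70B weights need multi-GPU or quantization
    device_map="auto",
)

messages = [{
    "role": "user",
    "content": "Translate the following English sentence into Tatar:\n"
               "The weather is beautiful today.",
}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens (the Tatar translation).
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```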


Model Overview

KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt is a specialized 70-billion-parameter language model built on the instruction-tuned Llama 3.3 architecture. Developed by KickItLikeShika, its primary function is high-quality machine translation from English into Tatar.

Key Capabilities

  • English to Tatar Translation: This model is specifically optimized for translating text from English into Tatar, leveraging its Llama 3.3 base and targeted fine-tuning.
  • Low-Resource Language Support: It was developed as part of the Low Resource Machine Translation Workshop (EACL26), indicating its focus on improving translation for languages with limited available data.
  • Large Context Window: With a context length of 32,768 tokens, it can handle substantial input texts for translation, maintaining coherence over longer passages; a token-budget check is sketched after this list.
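To make use of the 32k window without overrunning it, a caller can budget tokens before generating. A minimal sketch follows, assuming the tokenizer loaded above; GEN_BUDGET is an assumed reservation for the generated Tatar output, not a documented value.

```python
# Sketch of a pre-flight token-budget check for long inputs. MAX_CTX comes
# from the advertised 32,768-token context length; GEN_BUDGET is an assumed
# headroom for the generated translation, chosen for illustration.
from transformers import AutoTokenizer

MAX_CTX = 32_768
GEN_BUDGET = 2_048  # assumed headroom for the Tatar output

tokenizer = AutoTokenizer.from_pretrained(
    "KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt"
)

def fits_context(english_text: str) -> bool:
    """True if the prompt plus generation headroom fits in the window."""
    n_prompt_tokens = len(tokenizer.encode(english_text))
    return n_prompt_tokens + GEN_BUDGET <= MAX_CTX
```

Inputs that fail the check can be split on paragraph boundaries and translated in pieces.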

Good For

  • Academic Research: Ideal for researchers and linguists working on low-resource machine translation, particularly for the Tatar language.
  • Specific Translation Needs: Suitable for applications requiring accurate and context-aware English to Tatar translation.
  • Integration into MT Systems: Can serve as a core component in systems designed to bridge communication gaps involving Tatar.
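For integration into a larger MT system, the generation call can be wrapped in a small helper. This is a hypothetical sketch building on the loading code above; translate_en_to_tt and its prompt phrasing are illustrative, not a documented interface of this model.

```python
# Hypothetical helper for embedding the model in an MT pipeline.
# The function name and prompt phrasing are illustrative assumptions.
def translate_en_to_tt(text: str, model, tokenizer, max_new_tokens: int = 256) -> str:
    """Translate a single English passage into Tatar."""
    messages = [{
        "role": "user",
        "content": f"Translate the following English text into Tatar:\n{text}",
    }]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    # Greedy decoding keeps translations deterministic across pipeline runs.
    output = model.generate(input_ids, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
```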