KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt
Text Generation · Open Weights
Concurrency Cost: 4
Model Size: 70B
Quantization: FP8
Context Length: 32k
Published: Dec 25, 2025
License: apache-2.0
Architecture: Transformer
KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt is a 70-billion-parameter Llama 3.3 instruction-tuned model developed by KickItLikeShika. Fine-tuned from unsloth/llama-3.3-70b-instruct-unsloth-bnb-4bit, it specializes in English-to-Tatar machine translation, a low-resource language pair, and supports a context length of 32768 tokens.
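A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is available on the Hugging Face Hub under the ID above and follows the standard Llama 3.3 chat template. The instruction wording in `build_messages` is a hypothetical prompt for illustration; the card does not document a required prompt format.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "KickItLikeShika/llama-3.3-70B-Instruct-tatoeba-en-tt"


def build_messages(english_text: str) -> list[dict]:
    # Hypothetical instruction phrasing -- the model card does not specify one.
    return [
        {
            "role": "user",
            "content": f"Translate the following English sentence to Tatar:\n{english_text}",
        }
    ]


def translate(english_text: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # A 70B model needs multiple GPUs or heavy offloading; device_map="auto"
    # lets accelerate shard it across the available hardware.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer.apply_chat_template(
        build_messages(english_text),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Since the listing advertises FP8 quantization at a 32k context, serving this checkpoint through an inference endpoint rather than loading it locally may be the more practical route.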