yentinglin/Llama-3-Taiwan-70B-Instruct is a 70-billion-parameter instruction-tuned language model developed by yentinglin, based on the Llama-3 architecture. It is fine-tuned on a large corpus of Traditional Mandarin and English data and excels at language understanding, generation, reasoning, and multi-turn dialogue in these languages. The model supports an 8K context length and performs strongly on a range of Traditional Mandarin NLP benchmarks spanning the legal, manufacturing, medical, and electronics domains.