mesolitica/Malaysian-Llama-3.1-8B-Instruct
Task: Text generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Context Length: 32k
Published: May 3, 2025
Architecture: Transformer

mesolitica/Malaysian-Llama-3.1-8B-Instruct is an 8-billion-parameter instruction-tuned language model developed by Mesolitica, fine-tuned from Meta's Llama-3.1-8B-Instruct. It is optimized for understanding and generating text in the languages and scripts used in Malaysia, including Mandarin, Tamil, the Jawi script, and multiple regional Malaysian dialects. The model handles multi-turn Malaysian conversations on topics such as legislation, politics, religion, and local languages, making it suitable for applications that require deep cultural and linguistic understanding of Malaysia.
