mesolitica/Malaysian-Qwen2.5-7B-Instruct
Text Generation
Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Ctx Length: 32k
Published: Mar 3, 2025
Architecture: Transformer
Cold Starts: 0.0K

mesolitica/Malaysian-Qwen2.5-7B-Instruct is a 7.6 billion parameter instruction-tuned causal language model, fine-tuned by mesolitica from the Qwen2.5-7B-Instruct base model. It specializes in understanding and generating text in languages used in Malaysia, including Mandarin, Tamil, and Malay written in Jawi script, as well as multiple regional dialect variations. The model handles multi-turn conversations on Malaysian topics such as legislation, politics, religion, and local languages, and can also generate code in response to prompts in these languages.
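Like other Qwen2.5-Instruct derivatives, the model expects its conversation input in the ChatML format. A minimal sketch of how such a prompt is assembled (in practice the model's tokenizer handles this via `apply_chat_template`; the system message and user question below are illustrative examples only):

```python
def build_chatml_prompt(messages):
    """Format a list of {role, content} dicts into a ChatML-style
    prompt string, ending with an open assistant turn for the
    model to complete."""
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful Malaysian assistant."},
    {"role": "user", "content": "Apakah ibu negara Malaysia?"},
])
print(prompt)
```

The open `<|im_start|>assistant` turn at the end is what tells a ChatML-trained model to produce the assistant's reply rather than continue the user's text.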
