MaLA-LM/emma-500-llama3.1-8b-mono
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: May 10, 2025 · License: llama3 · Architecture: Transformer

MaLA-LM/emma-500-llama3.1-8b-mono is an 8-billion-parameter multilingual language model from MaLA-LM, continually pre-trained from Llama 3.1. It covers 546 languages, leveraging the MaLA Corpus, which includes books, code, instruction data, and papers. The model targets massively multilingual NLP tasks such as commonsense reasoning, machine translation, and text classification, making it suitable for applications that require broad language coverage.
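As a decoder-only causal language model on the Hub, it can be loaded with the standard Hugging Face `transformers` auto classes. A minimal sketch (assuming `transformers` and `torch` are installed; the prompt text is an illustrative example, not from the model card):

```python
"""Minimal generation sketch for MaLA-LM/emma-500-llama3.1-8b-mono."""

MODEL_ID = "MaLA-LM/emma-500-llama3.1-8b-mono"


def main():
    # Imports are deferred so the constants above can be used without
    # the heavyweight dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # fits on a single ~24 GB+ GPU in bf16
        device_map="auto",
    )

    # Illustrative multilingual prompt; this is a base (non-chat) model,
    # so plain text completion is the expected usage pattern.
    prompt = "Translate to Swahili: Good morning."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Because this is a continually pre-trained base model rather than an instruction-tuned chat model, prompts are plain-text completions; downstream use cases may warrant further fine-tuning.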
