MaLA-LM/emma-500-llama3.1-8b-bi
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 8K · Published: May 10, 2025 · License: llama3 · Architecture: Transformer
EMMA-500 Llama 3.1 8B is an 8-billion-parameter multilingual language model from MaLA-LM, produced by continual pre-training of Llama 3.1 8B. It covers 546 languages with substantial training data, drawing on the MaLA Corpus, which includes bilingual translation data spanning more than 2,500 language pairs. The model performs well on multilingual tasks such as commonsense reasoning, machine translation, and text classification, particularly for low-resource languages.
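Since the model is a continually pre-trained Llama 3.1 variant, it should load through the standard Hugging Face `transformers` causal-LM interface. The sketch below illustrates a zero-shot translation call; the prompt format is an assumption for illustration, not an official template from the model authors, so verify against the model card before relying on it.

```python
def build_translation_prompt(text: str, src: str, tgt: str) -> str:
    """Build a simple zero-shot translation prompt.

    Illustrative format only -- EMMA-500 does not publish a required
    prompt template here, so adjust as needed.
    """
    return f"Translate the following {src} text to {tgt}:\n{text}\n{tgt}:"


if __name__ == "__main__":
    # Assumes the `transformers` and `torch` packages are installed and
    # enough GPU/CPU memory is available for an 8B FP8/BF16 checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "MaLA-LM/emma-500-llama3.1-8b-bi"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    prompt = build_translation_prompt("Bonjour le monde.", "French", "English")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:],
                           skip_special_tokens=True))
```

For low-resource languages, few-shot prompts with in-language examples typically work better than the zero-shot format shown above.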