mesolitica/malaysian-llama-3-8b-instruct-16k
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Apr 27, 2024 · Architecture: Transformer
mesolitica/malaysian-llama-3-8b-instruct-16k is an 8-billion-parameter instruction-tuned causal language model developed by mesolitica. It is a full-parameter fine-tune of Llama-3, optimized for Malaysian chat completion tasks with a 16,384-token context length. The model excels at understanding and generating responses in the Malaysian language, making it suitable for applications that require deep linguistic and cultural understanding of Malaysia.
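As a minimal usage sketch, the model can be loaded through the Hugging Face transformers library like any other Llama-3 instruct checkpoint. This assumes the repository ships a standard Llama-3 chat template; the example prompt, dtype, and generation settings below are illustrative choices, not values from the model card.

```python
# Minimal sketch: chat completion with transformers.
# Assumptions: the checkpoint provides a Llama-3 chat template,
# and bfloat16 fits on the available GPU (use float16 or "auto" otherwise).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mesolitica/malaysian-llama-3-8b-instruct-16k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Malay-language prompt (English: "What is nasi lemak?") -- hypothetical example
messages = [{"role": "user", "content": "Apakah itu nasi lemak?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```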