aisingapore/Llama-SEA-LION-v3-8B
TEXT GENERATION
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Dec 11, 2024
License: llama3.1
Architecture: Transformer

Llama-SEA-LION-v3-8B is an 8-billion-parameter multilingual decoder-only large language model developed by AI Singapore, based on the Llama 3.1 architecture. It has undergone continued pre-training on approximately 200 billion tokens across 11 Southeast Asian languages: Burmese, Chinese, English, Filipino, Indonesian, Khmer, Lao, Malay, Tamil, Thai, and Vietnamese. The model is designed specifically for the Southeast Asia region, with strong general language capability and constraint-following behavior across these languages, and supports a context length of 32,768 tokens. Its primary strength is its specialized multilingual support and its performance on SEA-specific linguistic tasks.
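As a minimal sketch of how the model can be used, the repository ID above can be passed to the standard Hugging Face transformers loading functions. The `generate` helper below is illustrative, not part of the model card; it assumes a machine with enough memory for an 8B FP8/BF16 checkpoint.

```python
# Hypothetical usage sketch for Llama-SEA-LION-v3-8B via Hugging Face
# transformers. Nothing here is prescribed by the model card beyond the
# repository ID and the 32,768-token context length.
MODEL_ID = "aisingapore/Llama-SEA-LION-v3-8B"
MAX_CONTEXT = 32768  # context length stated in the model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion; downloads the model weights on first call."""
    # Imported inside the function so the sketch can be read/imported
    # without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```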


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model.

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
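The parameters above map onto the fields of a text-completion request. As a sketch, assuming an OpenAI-compatible API that also accepts the extended sampler fields (repetition_penalty, min_p), a request body could look like the following; the numeric values are illustrative placeholders, not the actual top-3 combinations from Featherless users.

```python
# Hypothetical sampler configuration for Llama-SEA-LION-v3-8B.
# All values below are placeholders for illustration only.
sampler_settings = {
    "temperature": 0.7,          # randomness of sampling
    "top_p": 0.9,                # nucleus sampling cutoff
    "top_k": 40,                 # restrict to the k most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens by how often they appear
    "presence_penalty": 0.0,     # penalize tokens that appeared at all
    "repetition_penalty": 1.1,   # multiplicative repetition penalty
    "min_p": 0.05,               # drop tokens below this relative probability
}

# Request body in the OpenAI-compatible completions shape (assumed).
payload = {
    "model": "aisingapore/Llama-SEA-LION-v3-8B",
    "prompt": "Terjemahkan ke bahasa Inggris: Selamat pagi.",
    "max_tokens": 64,
    **sampler_settings,
}
```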