aisingapore/Llama-SEA-LION-v3-70B-IT
Text Generation
Concurrency Cost: 4 | Model Size: 70B | Quant: FP8 | Context Length: 32k | Published: Dec 11, 2024 | License: llama3.1 | Architecture: Transformer
Llama-SEA-LION-v3-70B-IT is a 70 billion parameter instruction-tuned decoder-only language model developed by AI Singapore, built upon the Llama 3.1 architecture. It is specifically pretrained and instruction-tuned for Southeast Asian languages, supporting Burmese, Chinese, English, Filipino, Indonesian, Javanese, Khmer, Lao, Malay, Sundanese, Tamil, Thai, and Vietnamese. With a 32,768 token context length, this model excels in general language understanding and instruction-following tasks across the SEA region.
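Below is a minimal sketch of querying the model locally through Hugging Face Transformers, using the aisingapore/Llama-SEA-LION-v3-70B-IT repository ID. The prompt, dtype, and generation settings are illustrative assumptions only; in practice a 70B model needs multi-GPU hardware or a quantized deployment such as the FP8 build noted above.

```python
# Minimal sketch: loading Llama-SEA-LION-v3-70B-IT with Hugging Face Transformers.
# Assumes torch and transformers are installed and enough GPU memory is available
# for a 70B model (multi-GPU or a quantized build in practice).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aisingapore/Llama-SEA-LION-v3-70B-IT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # illustrative; actual serving may use FP8
    device_map="auto",
)

# The instruction-tuned model expects chat-formatted input via the chat template.
messages = [
    {"role": "user", "content": "Terjemahkan ke Bahasa Inggris: Selamat pagi, apa khabar?"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```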
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. The sampler parameters exposed are listed below; a sketch of passing such settings in a request follows the list.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
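As a sketch of how these sampler parameters could be supplied, the example below uses the OpenAI Python client against an OpenAI-compatible endpoint. The base URL, the use of extra_body for non-standard parameters, and every numeric value shown are illustrative assumptions, not the actual top configurations from this page.

```python
# Minimal sketch: sending the sampler parameters above with an OpenAI-compatible
# client. Base URL, API key handling, and all numeric values are illustrative
# assumptions rather than the actual user configs.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_FEATHERLESS_API_KEY",        # placeholder
)

response = client.chat.completions.create(
    model="aisingapore/Llama-SEA-LION-v3-70B-IT",
    messages=[{"role": "user", "content": "Tolong perkenalkan diri anda dalam Bahasa Melayu."}],
    # Standard OpenAI-style sampler parameters:
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # top_k, repetition_penalty, and min_p are not part of the standard OpenAI API;
    # they are passed via extra_body, assuming the server accepts these extensions.
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```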