aisingapore/Gemma-SEA-LION-v3-9B-IT
Text Generation · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Oct 30, 2024 · License: gemma · Architecture: Transformer

Gemma-SEA-LION-v3-9B-IT is a 9-billion-parameter, instruction-tuned, decoder-only large language model developed by AI Singapore, based on the Gemma2 architecture. It is specifically pretrained and instruction-tuned for the Southeast Asia (SEA) region, supporting 13 languages: Burmese, Chinese, English, Filipino, Indonesian, Javanese, Khmer, Lao, Malay, Sundanese, Tamil, Thai, and Vietnamese. With a context length of 8192 tokens, this model excels at instruction-following tasks in a multilingual Southeast Asian context.
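Since the model follows the Gemma2 architecture, prompts for the instruction-tuned variant use Gemma's turn-marker chat format. A minimal sketch, assuming the standard Gemma2 `<start_of_turn>`/`<end_of_turn>` template (the helper name is illustrative, not part of any library):

```python
def build_gemma_prompt(user_message: str) -> str:
    """Format a single-turn prompt using the Gemma2 chat template:
    a user turn delimited by <start_of_turn>/<end_of_turn>, followed
    by an opened model turn for the completion to fill in."""
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

# Example: an Indonesian instruction, one of the model's 13 supported languages.
prompt = build_gemma_prompt("Terjemahkan ke bahasa Inggris: Selamat pagi")
print(prompt)
```

In practice, a tokenizer's built-in chat template (e.g. `apply_chat_template` in Hugging Face Transformers) handles this formatting automatically; the sketch only makes the wire format explicit.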


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model. Each config sets the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
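These sampler parameters map directly onto a completion request to an OpenAI-compatible endpoint. A minimal sketch of such a request payload; the numeric values below are illustrative placeholders, not the actual user configs shown on the page:

```python
import json

# Placeholder sampler values -- the real "top 3" user configs are only
# visible interactively on the model page and are not reproduced here.
sampler_settings = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

# Merge the sampler settings into a completion request body.
payload = {
    "model": "aisingapore/Gemma-SEA-LION-v3-9B-IT",
    "prompt": "Selamat pagi",
    "max_tokens": 256,
    **sampler_settings,
}
print(json.dumps(payload, indent=2))
```

Lower `temperature`/`top_p` values favor deterministic instruction-following; `repetition_penalty` and `min_p` are common additions for long-form generation.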