aisingapore/Gemma-SEA-LION-v4-27B-IT
TEXT GENERATION
Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: Aug 11, 2025 · License: gemma · Vision · Architecture: Transformer

Gemma-SEA-LION-v4-27B-IT is a 27-billion-parameter decoder-only large language model developed by AI Singapore, based on the Gemma 3 architecture. It is post-trained and instruction-tuned for Southeast Asian (SEA) languages, including Burmese, English, Indonesian, Khmer, Lao, Malay, Tagalog, Tamil, Thai, and Vietnamese, and supports a 128K-token context length. The model excels at SEA-specific tasks and offers both image and text understanding, alongside advanced function calling.
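A minimal sketch of querying this model through an OpenAI-compatible chat-completions endpoint. The endpoint URL, API key handling, and the example prompt are assumptions for illustration; only the model identifier comes from this page.

```python
import json

# Assumed endpoint; substitute your provider's actual URL and credentials.
API_URL = "https://api.featherless.ai/v1/chat/completions"

# Build a chat-completions request body for a SEA-language task
# (here: an Indonesian-to-English translation prompt, chosen as an example).
payload = {
    "model": "aisingapore/Gemma-SEA-LION-v4-27B-IT",
    "messages": [
        {"role": "user", "content": "Terjemahkan ke bahasa Inggris: Selamat pagi!"}
    ],
    "max_tokens": 256,
}

body = json.dumps(payload)  # this JSON string would be POSTed to API_URL
```

Sending the request itself (e.g. with `requests.post(API_URL, data=body, headers=...)`) is omitted, since authentication details vary by account.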


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model adjust the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
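The sampler settings above map directly onto request parameters in most OpenAI-style inference APIs. The sketch below shows one such configuration with placeholder values (the actual Featherless user configurations are not reproduced on this page), plus a small sanity check on the usual parameter ranges.

```python
# Placeholder values for illustration only; they are NOT the actual
# "Popular Sampler Settings" from Featherless, which are behind a tabbed UI.
sampler_config = {
    "temperature": 0.7,        # >= 0; higher = more random
    "top_p": 0.9,              # nucleus sampling, in [0, 1]
    "top_k": 40,               # keep only the k most likely tokens; 0 disables
    "frequency_penalty": 0.0,  # penalize tokens by their frequency so far
    "presence_penalty": 0.0,   # penalize tokens that have appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty; 1.0 = no penalty
    "min_p": 0.05,             # drop tokens below this fraction of the top prob
}

def validate_sampler(cfg: dict) -> bool:
    """Basic range checks for common sampler parameters."""
    return (
        cfg["temperature"] >= 0.0
        and 0.0 <= cfg["top_p"] <= 1.0
        and cfg["top_k"] >= 0
        and cfg["repetition_penalty"] > 0.0
        and 0.0 <= cfg["min_p"] <= 1.0
    )
```

In practice this dictionary would be merged into the chat-completions request body alongside `model` and `messages`.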