Sahabat-AI/Llama-Sahabat-AI-v2-70B-IT
Text generation
Concurrency cost: 4
Model size: 70B
Quant: FP8
Ctx length: 32k
Published: May 30, 2025
License: llama3.1
Architecture: Transformer

Sahabat-AI/Llama-Sahabat-AI-v2-70B-IT is a 70 billion parameter decoder-only large language model developed by PT GoTo Gojek Tokopedia Tbk and AI Singapore, instruct-tuned for Indonesian languages. Utilizing the Llama 3.1 tokenizer, it supports English, Indonesian, Javanese, Sundanese, Batak Toba, and Balinese, with a context length of 128k tokens. This model is specifically optimized for local Indonesian contexts and multilingual instruction-following, evaluated on benchmarks like IndoMMLU and SEA-HELM.
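Below is a minimal sketch of sending a text-generation request to this model through an OpenAI-compatible chat-completions client. The base URL, environment variable name, and example prompt are illustrative assumptions and are not part of this page.

```python
# Minimal sketch, assuming an OpenAI-compatible endpoint.
# Base URL, API-key env var, and prompt are assumptions for illustration.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumed endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # assumed env var holding your key
)

response = client.chat.completions.create(
    model="Sahabat-AI/Llama-Sahabat-AI-v2-70B-IT",
    messages=[
        {"role": "user", "content": "Jelaskan sejarah singkat kota Yogyakarta."},
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```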


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model tune the following sampler settings (see the example request after this list):

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
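
The sketch below shows how these sampler settings could be attached to a request. The values are placeholders, not the actual user configurations from this page; parameters without a standard OpenAI field (top_k, min_p, repetition_penalty) are passed via extra_body, assuming the serving backend accepts them.

```python
# Illustrative sketch only: placeholder sampler values, assumed endpoint.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumed endpoint, as above
    api_key=os.environ["FEATHERLESS_API_KEY"],
)

response = client.chat.completions.create(
    model="Sahabat-AI/Llama-Sahabat-AI-v2-70B-IT",
    messages=[{"role": "user", "content": "Tuliskan pantun tentang kopi."}],
    temperature=0.7,            # placeholder value
    top_p=0.9,                  # placeholder value
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={
        "top_k": 40,            # non-standard fields, backend-dependent
        "min_p": 0.05,
        "repetition_penalty": 1.05,
    },
)
print(response.choices[0].message.content)
```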