Sahabat-AI/llama3-8b-cpt-sahabatai-v1-instruct
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: May 30, 2025 · License: llama3 · Architecture: Transformer

Sahabat-AI/llama3-8b-cpt-sahabatai-v1-instruct is an 8-billion-parameter instruction-tuned causal language model developed by PT GoTo Gojek Tokopedia Tbk and AI Singapore, based on the Llama 3 architecture with an 8192-token context length. The model is optimized for Indonesian and its regional languages, Javanese and Sundanese, through fine-tuning on approximately 448,000 Indonesian, 96,000 Javanese, and 98,000 Sundanese instruction-completion pairs. It is aimed at multilingual instruction following in the Southeast Asian context and shows strong results on the SEA HELM and IndoMMLU benchmarks for Indonesian, Javanese, and Sundanese tasks.
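A minimal sketch of how the model could be used for chat-style generation with Hugging Face `transformers`, assuming the repository id matches the card title (the exact hosting path may differ) and that the model ships a standard Llama 3 chat template. The example Indonesian prompt is illustrative only.

```python
# Hedged sketch: instruction-following generation with Hugging Face transformers.
# Assumptions: the repo id below matches the card title, and the tokenizer
# provides a chat template (standard for Llama 3 instruct models).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Sahabat-AI/llama3-8b-cpt-sahabatai-v1-instruct"


def build_messages(user_text: str) -> list[dict]:
    # Wrap a single user turn in the message format expected by
    # tokenizer.apply_chat_template.
    return [{"role": "user", "content": user_text}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Render the chat template and append the assistant header so the
    # model continues as the assistant.
    input_ids = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    # Example Indonesian prompt (illustrative).
    print(generate("Apa ibu kota Indonesia?"))
```

Loading the 8B weights requires a GPU with sufficient memory (or quantized weights); `device_map="auto"` lets `accelerate` place layers across available devices.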
