Sahabat-AI/gemma2-9b-cpt-sahabatai-v1-base
Text generation | Model size: 9B | Quantization: FP8 | Context length: 16k | Published: May 30, 2025 | License: Gemma | Architecture: Transformer | Concurrency cost: 1
Sahabat-AI/gemma2-9b-cpt-sahabatai-v1-base is a 9-billion-parameter decoder-only language model developed by PT GoTo Gojek Tokopedia Tbk and AI Singapore, built on the Gemma2 9B CPT SEA-LIONv3 base. It has undergone continued pre-training on approximately 50 billion tokens, optimized for Indonesian and related regional languages, including Javanese and Sundanese. The model delivers strong general language capabilities across these Southeast Asian languages, making it well suited to applications that require solid performance in the Indonesian linguistic context.
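A minimal sketch of running the model with the Hugging Face `transformers` library. This assumes `transformers` (and PyTorch) are installed and that enough memory is available for the 9B weights; the Indonesian prompt is illustrative only. Since this is a base (non-instruct) model, it is used for plain text completion rather than chat.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Sahabat-AI/gemma2-9b-cpt-sahabatai-v1-base"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model and complete the given prompt.

    Downloading the weights requires substantial disk space and
    memory; a GPU is strongly recommended (device_map="auto" will
    place the model on available accelerators).
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Base-model completion example (hypothetical prompt):
# print(generate("Ibu kota Indonesia adalah"))
```

Because the 16k context window is supported, longer Indonesian documents can be passed in the prompt; keep the total token count (prompt plus `max_new_tokens`) within that limit.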