aitfindonesia/KomdigiUB-8B-Base
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Dec 10, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights
KomdigiUB-8B-Base by Tim 1 AITF is an 8-billion-parameter Indonesian causal language model built on the Qwen3-8B architecture and adapted via Continued Pre-Training (CPT) to digital policy and oversight domains. Training uses LoRA and 4-bit quantization for efficiency, targeting Indonesian public policy and digital regulation contexts. The model reaches a validation perplexity of ~3.55 and scores ~65.66 on IndoMMLU, making it suitable for domain-specific knowledge enrichment and as a pre-adapted base for further fine-tuning.
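A minimal usage sketch, assuming the weights are published under the repo id above and load with the standard Hugging Face `transformers` API (the card does not show a loading snippet, so paths and generation parameters here are illustrative). Since this is a base model with no chat template, a plain continuation prompt is used:

```python
# Assumption: repo id matches the card title; verify availability before running.
MODEL_ID = "aitfindonesia/KomdigiUB-8B-Base"


def build_prompt(topic: str) -> str:
    """Base (non-instruct) models complete text, so we phrase the task
    as the start of an Indonesian policy-summary passage."""
    return f"Ringkasan kebijakan digital Indonesia mengenai {topic}:\n"


if __name__ == "__main__":
    # Deferred import: loading an 8B checkpoint requires transformers,
    # torch, and substantial GPU memory.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    inputs = tokenizer(
        build_prompt("pelindungan data pribadi"), return_tensors="pt"
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The heavy model load is kept under the `__main__` guard so the prompt helper can be reused (e.g. in a fine-tuning data pipeline) without pulling in the weights.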