ThaiLLM/ThaiLLM-8B
Task: Text generation
Model size: 8B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Published: Dec 3, 2025
License: apache-2.0
Architecture: Transformer
Open weights

ThaiLLM/ThaiLLM-8B is a base language model developed by ThaiLLM through continued pre-training of Qwen3-8B-Base. It was trained on a diverse corpus of approximately 63 billion tokens, a significant portion of which is Thai-language data. The model is optimized for natural language understanding in Thai and shows substantial improvements over its base model on Thai-specific benchmarks. As a base model, it requires instruction fine-tuning before use in downstream applications.