typhoon-ai/typhoon-s-thaillm-8b-instruct-research-preview
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32K · Published: Dec 17, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Typhoon-S-ThaiLLM-8B-Instruct is an 8-billion-parameter instruction-tuned Thai and English large language model developed by Typhoon-AI, based on the Qwen3 architecture and the ThaiLLM base model. It supports a 32K context length and emphasizes openness and reproducibility: the training dataset, code, and technical report are all publicly available. The model demonstrates how competitive instruction models for local languages can be built on an academic budget, aiming to democratize post-training from base models for sovereign AI.
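A minimal sketch of how an instruction-tuned checkpoint like this might be queried with Hugging Face `transformers`. The repository id is taken from the title above; the chat-template roles, system prompt, and generation settings are assumptions for illustration, not details confirmed by this card:

```python
# Hypothetical usage sketch; the repo id comes from this card's title,
# everything else (chat roles, generation settings) is assumed.
MODEL_ID = "typhoon-ai/typhoon-s-thaillm-8b-instruct-research-preview"


def build_messages(user_text: str,
                   system_text: str = "You are a helpful Thai/English assistant.") -> list[dict]:
    """Build a chat-format message list suitable for tokenizer.apply_chat_template."""
    return [
        {"role": "system", "content": system_text},
        {"role": "user", "content": user_text},
    ]


def main() -> None:
    # Not invoked here: loading an 8B model requires downloading the weights
    # and substantial GPU memory.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = tokenizer.apply_chat_template(
        build_messages("Please introduce yourself."),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:],
                           skip_special_tokens=True))
```

The `build_messages` helper is hypothetical; the checkpoint's actual chat template (if any) would be defined in its tokenizer configuration.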
