aisingapore/WangchanLION-v3
Text generation · Model size: 8B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Jul 9, 2025 · License: llama3.1 · Architecture: Transformer

WangchanLION-v3 is an 8-billion-parameter decoder-only language model developed by AI Singapore and VISTEC, built on the Llama3 architecture. It was continually pre-trained on 47.4 billion Thai samples, specializing it in Southeast Asian languages, particularly Thai. With a 128k context length, the model is designed as a base for supervised fine-tuning (SFT) in Thai-language applications.
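As a sketch of how a decoder-only checkpoint like this is typically used, the snippet below loads the model with the Hugging Face `transformers` library and generates a Thai completion. This assumes the weights are published on the Hub under the id shown at the top of this card and that a chat template is bundled with the tokenizer; neither is confirmed by the card itself, and the actual inference setup may differ.

```python
# Hedged sketch: assumes the checkpoint "aisingapore/WangchanLION-v3" is
# available on the Hugging Face Hub and ships a tokenizer chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "aisingapore/WangchanLION-v3"  # id from this card


def make_messages(instruction: str) -> list[dict]:
    """Build a single-turn chat message list in the standard
    transformers chat format (role/content dicts)."""
    return [{"role": "user", "content": instruction}]


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick the checkpoint's native precision
        device_map="auto",    # place layers on available GPU(s)/CPU
    )
    # Format the Thai instruction with the tokenizer's chat template.
    inputs = tokenizer.apply_chat_template(
        make_messages("ช่วยอธิบายว่าโมเดลภาษาคืออะไร"),  # "Explain what a language model is"
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Loading the 8B weights requires roughly 8-16 GB of accelerator memory depending on precision; for lighter experimentation, the same id can be served through an inference endpoint instead of loading locally.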
