Tsunami-th/Tsunami-1.0-14B-Instruct
Text generation · Concurrency cost: 1 · Model size: 14.8B · Quantization: FP8 · Context length: 32k · Published: Oct 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Tsunami-1.0-14B-Instruct is a 14-billion-parameter Thai large language model developed by Pollakrit Lorprasertkul, fine-tuned from Qwen2.5-14B on a Thai dataset. The model is optimized for Thai language understanding and generation, and outperforms other models in its class on Thai-centric benchmarks such as ThaiExam and M3Exam.
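A minimal usage sketch with Hugging Face `transformers`, assuming the model follows the standard Qwen2.5 chat template inherited from its base model; the helper names (`build_messages`, `generate_reply`) and the generation settings are illustrative, not from the model card:

```python
MODEL_ID = "Tsunami-th/Tsunami-1.0-14B-Instruct"


def build_messages(user_text: str) -> list:
    """Wrap a single user turn in the chat-message format
    expected by tokenizer.apply_chat_template."""
    return [{"role": "user", "content": user_text}]


def generate_reply(user_text: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a reply.

    Requires the `transformers` and `torch` packages and enough
    GPU memory for a 14B FP8/bf16 checkpoint; settings here are
    a sketch, not the author's recommended configuration.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(user_text),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)
```

For example, `generate_reply("สวัสดีครับ ช่วยแนะนำตัวหน่อย")` ("Hello, please introduce yourself") should produce a Thai-language response.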
