URajinda/ShweYon-Qwen2.5-Burmese-1.5B-v1.2
Text Generation
Concurrency Cost: 1
Model Size: 1.5B
Quant: BF16
Ctx Length: 32k
Published: Dec 30, 2025
License: apache-2.0
Architecture: Transformer
Open Weights · Warm

URajinda/ShweYon-Qwen2.5-Burmese-1.5B-v1.2 is a 1.5-billion-parameter language model based on the Qwen2.5 architecture, optimized for the Myanmar (Burmese) language. The model features a significant vocabulary expansion designed to address common tokenization inefficiencies in Burmese natural language processing. Developed by URajinda, it focuses on enhancing Burmese language understanding and generation through continual pre-training.
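To see why vocabulary expansion matters here, note that Burmese script codepoints each occupy three bytes in UTF-8, so a byte-level tokenizer with few Burmese-specific merges can emit several tokens per character. A minimal stdlib-only illustration (this uses plain UTF-8 byte counts, not the model's actual tokenizer):

```python
# Illustration of why Burmese tokenization is costly for byte-level tokenizers:
# every Burmese codepoint (U+1000-U+109F) encodes to 3 bytes in UTF-8, so a
# tokenizer lacking Burmese merges can approach ~3 tokens per character.
text = "မင်္ဂလာပါ"  # "Hello" in Burmese

num_chars = len(text)                   # 9 codepoints
num_bytes = len(text.encode("utf-8"))   # 27 UTF-8 bytes

print(f"{num_chars} characters -> {num_bytes} UTF-8 bytes")
```

An expanded vocabulary with dedicated Burmese tokens brings the tokens-per-word ratio down, which shortens sequences and makes better use of the 32k context window.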
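A minimal usage sketch, assuming the model follows the standard Hugging Face `transformers` causal-LM interface (the prompt string and generation parameters below are illustrative, not from the model card):

```python
# Sketch: load ShweYon-Qwen2.5-Burmese-1.5B-v1.2 and generate Burmese text.
# Assumes the standard transformers AutoModelForCausalLM interface and BF16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "URajinda/ShweYon-Qwen2.5-Burmese-1.5B-v1.2"

def generate_burmese(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a Burmese continuation for the given prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the published BF16 quantization
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, dropping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `generate_burmese("မြန်မာနိုင်ငံသည်")` would return a Burmese continuation of the prompt; the first call downloads the weights from the Hugging Face Hub.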
