URajinda/Qwen-1.5B-Burmese-SFT-v2
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Dec 11, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights
URajinda/Qwen-1.5B-Burmese-SFT-v2 is a 1.5-billion-parameter language model developed by URajinda, based on Alibaba Cloud's Qwen1.5-1.8B architecture. The model has been instruction-tuned (SFT) specifically for Burmese, targeting question answering and instruction-following tasks in that language. It supports a context length of 32,768 tokens, making it suitable for processing longer Burmese texts.
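As an instruction-tuned causal language model, it can be loaded with the standard Hugging Face `transformers` API. The sketch below is a minimal, hedged example: it assumes the repository ships a tokenizer with a Qwen-style chat template and that the model works with `AutoModelForCausalLM`; the exact prompt format used during this fine-tune may differ, so check the repository before relying on it.

```python
# Minimal usage sketch for URajinda/Qwen-1.5B-Burmese-SFT-v2 via Hugging Face
# transformers. Assumes the repo's tokenizer provides a chat template; the
# actual template used during SFT may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "URajinda/Qwen-1.5B-Burmese-SFT-v2"


def build_messages(instruction: str) -> list[dict]:
    """Wrap a Burmese instruction in a chat-style message list."""
    return [{"role": "user", "content": instruction}]


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Load the model, format the prompt, and return the decoded reply."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    prompt = tokenizer.apply_chat_template(
        build_messages(instruction), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


# Example call (downloads the model weights on first use):
# generate("မြန်မာနိုင်ငံရဲ့ မြို့တော်က ဘာလဲ။")  # "What is the capital of Myanmar?"
```

With BF16 weights, the 1.5B model needs roughly 3 GB of memory for inference, so it fits comfortably on a single consumer GPU or can run on CPU.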