URajinda/Qwen-1.5B-Burmese-SFT-v2

Hosted on Hugging Face

Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Dec 11, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

URajinda/Qwen-1.5B-Burmese-SFT-v2 is a 1.5-billion-parameter language model developed by URajinda, based on Alibaba Cloud's Qwen-1.5-1.8B architecture. The model has been instruction-tuned (SFT) specifically for Burmese, with a focus on question answering and instruction following. It supports a context length of 32,768 tokens, making it suitable for processing longer Burmese texts.


Overview

URajinda/Qwen-1.5B-Burmese-SFT-v2 is a specialized large language model (LLM) built upon the Alibaba Cloud Qwen-1.5-1.8B foundational model. It has undergone supervised fine-tuning (SFT) with a focus on the Burmese language.

Key Capabilities

  • Burmese Language Proficiency: Specifically fine-tuned to understand and generate text in Burmese.
  • Question Answering (QA): Designed to perform well in answering questions posed in Burmese.
  • Instruction Following: Capable of adhering to instructions provided in Burmese.
  • Chat Format: Trained to interact using a chat-like structure, expecting User: and Assistant: tags for prompts.
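The User:/Assistant: chat layout above can be sketched as a small helper. Note that the card only names the two tags; the single-newline separation used here is an assumption, not a documented template:

```python
def format_prompt(user_message: str) -> str:
    # Wrap a user message (Burmese or otherwise) in the User:/Assistant:
    # chat structure the model card describes. The newline placement is
    # an assumption; only the tag names come from the card.
    return f"User: {user_message}\nAssistant:"
```

For example, `format_prompt("မင်္ဂလာပါ")` produces a prompt ending in `Assistant:`, leaving the model to generate the assistant's turn.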

Use Cases

This model is particularly well-suited for applications requiring natural language processing in Burmese, such as:

  • Developing Burmese-language chatbots.
  • Automated customer support systems in Burmese.
  • Content generation or summarization for Burmese text.
  • Educational tools for Burmese speakers.
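For applications like the above, a minimal inference sketch with the Hugging Face transformers library might look as follows. This assumes the model exposes the standard Qwen-1.5 causal-LM interface and the User:/Assistant: prompt layout described earlier; the generation settings and the example question are illustrative only:

```python
MODEL_ID = "URajinda/Qwen-1.5B-Burmese-SFT-v2"

def generate_reply(model, tokenizer, question: str, max_new_tokens: int = 256) -> str:
    # Build the User:/Assistant: prompt the card describes, generate a
    # continuation, and return only the assistant's portion of the text.
    prompt = f"User: {question}\nAssistant:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    return text[len(prompt):].strip()

if __name__ == "__main__":
    # transformers is imported lazily so the helper above can be reused
    # or tested without the library installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    print(generate_reply(model, tokenizer, "မြန်မာနိုင်ငံအကြောင်း ပြောပြပါ။"))
```

Because the helper takes the model and tokenizer as arguments, the same code works whether the weights are loaded in BF16 locally or served behind a pipeline.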