Torsahaphat/thai-dialect_korat_model-merged
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Context Length: 32k · Published: Mar 7, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
The Torsahaphat/thai-dialect_korat_model-merged is a 3.1-billion-parameter, Qwen2-based, instruction-tuned causal language model developed by Torsahaphat. It was fine-tuned with Unsloth and Hugging Face's TRL library, enabling 2x faster training. The model is specifically optimized for the Korat dialect of Thai, making it suitable for applications that require localized Thai language understanding and generation.
Model Overview
The Torsahaphat/thai-dialect_korat_model-merged is a 3.1-billion-parameter language model based on the Qwen2 architecture. Developed by Torsahaphat, it has been instruction-tuned to specialize in the Korat dialect of Thai.
Key Characteristics
- Base Model: Fine-tuned from unsloth/qwen2.5-3b-instruct-unsloth-bnb-4bit.
- Training Efficiency: Leverages Unsloth and Hugging Face's TRL library for 2x faster fine-tuning (see the sketch after this list).
- Parameter Count: Features 3.1 billion parameters, offering a balance between performance and computational efficiency.
- Context Length: Supports a context length of 32,768 tokens.
- License: Distributed under the Apache-2.0 license.
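
The card does not include the training script, but the Unsloth + TRL stack it cites is typically driven as shown below. This is a minimal sketch, not the author's actual recipe: the dataset file, sequence length, LoRA settings, and hyperparameters are illustrative placeholders, and the SFTTrainer keyword arguments follow the style of Unsloth's example notebooks (newer TRL releases move some of them into SFTConfig).

```python
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the 4-bit base checkpoint named on this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/qwen2.5-3b-instruct-unsloth-bnb-4bit",
    max_seq_length=4096,  # placeholder; the card does not state the training length
    load_in_4bit=True,
)

# Attach LoRA adapters; rank and target modules are illustrative defaults.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Hypothetical dataset of Korat-dialect instruction pairs, pre-rendered
# into a single "text" column with the chat template already applied.
dataset = load_dataset("json", data_files="korat_dialect.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=4096,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()

# Merging the LoRA adapters back into the base weights produces a
# standalone "-merged" checkpoint like the one this card describes.
model.save_pretrained_merged("thai-dialect_korat_model-merged", tokenizer)
```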
Use Cases
This model is particularly well-suited for applications requiring:
- Thai Korat Dialect Processing: Understanding, generating, or translating text in the Korat dialect of Thai (a minimal inference example follows this list).
- Localized Content Creation: Developing content or chatbots tailored for speakers of the Korat dialect.
- Research in Dialectal NLP: Studying and experimenting with regional linguistic variations within the Thai language.
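
As a concrete starting point, the merged checkpoint should load with standard Hugging Face Transformers. The sketch below assumes the repository ships the Qwen2 chat template inherited from the instruct base; the prompt, which asks for a translation into the Korat dialect, is purely illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Torsahaphat/thai-dialect_korat_model-merged"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card lists BF16 weights
    device_map="auto",
)

# Illustrative request: translate a standard Thai sentence into the Korat dialect.
messages = [
    {"role": "user", "content": "แปลประโยคนี้เป็นภาษาโคราช: วันนี้อากาศดีมาก"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Because the adapters are already merged into the base weights, no PEFT dependency is required at inference time.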