manotham/Thai-dialogue-transalate_sft_80K

Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Apr 28, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

manotham/Thai-dialogue-transalate_sft_80K is a 4-billion-parameter, Qwen3-based, instruction-tuned language model developed by manotham, with a context length of 32,768 tokens. It is fine-tuned specifically for Thai dialogue translation and was trained with Unsloth and Hugging Face's TRL library for accelerated fine-tuning. Its primary strength is handling dialogue-based translation in the Thai language.


Overview

manotham/Thai-dialogue-transalate_sft_80K is a 4-billion-parameter language model based on the Qwen3 architecture, developed by manotham. Its 32,768-token context length makes it suitable for processing longer dialogue sequences, and its instruction tuning means it is optimized for following specific commands and generating targeted responses.

Key Capabilities

  • Thai Dialogue Translation: The model is fine-tuned specifically for translating dialogue to and from Thai, suggesting proficiency in understanding and generating conversational Thai.
  • Qwen3 Architecture: Built upon the Qwen3 foundation, it benefits from the general language understanding and generation capabilities inherent to this model family.
  • Efficient Training: The model was trained using Unsloth and Hugging Face's TRL library, which enabled a 2x faster fine-tuning process.
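Since the model is instruction-tuned on the Qwen3 chat format, a translation request would typically be expressed as a chat message list and rendered with the tokenizer's chat template. The sketch below shows one way to build such a request; the system-prompt wording and the `build_translation_messages` helper are illustrative assumptions, not taken from the model card.

```python
# Hypothetical sketch: build a chat-style Thai-translation request.
# The exact prompt wording used during fine-tuning is not documented in
# the model card, so this system message is an assumption.

def build_translation_messages(thai_dialogue: str, target_lang: str = "English"):
    """Return a chat message list in the role/content format that
    tokenizer.apply_chat_template() expects for Qwen3-style models."""
    return [
        {"role": "system",
         "content": f"Translate the following Thai dialogue into {target_lang}."},
        {"role": "user", "content": thai_dialogue},
    ]

messages = build_translation_messages("สวัสดีครับ วันนี้เป็นอย่างไรบ้าง")

# These messages would then be passed to a loaded model, e.g. with transformers:
#   tok = AutoTokenizer.from_pretrained("manotham/Thai-dialogue-transalate_sft_80K")
#   prompt = tok.apply_chat_template(messages, tokenize=False,
#                                    add_generation_prompt=True)
# followed by a normal generate() call on the BF16 model weights.
```

Keeping the instruction in a system message and the dialogue in a user message mirrors the usual supervised fine-tuning layout, so the same helper can be reused for batch translation by swapping in different user contents.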

Good For

  • Thai Language Applications: Ideal for use cases requiring translation or processing of conversational text in Thai.
  • Dialogue Systems: Suitable for integration into chatbots, virtual assistants, or other systems that handle spoken or written dialogue in Thai.
  • Research and Development: Provides a specialized base for further experimentation or fine-tuning on specific Thai dialogue datasets.