Pathumma LLM AI: Thai-Specialized Gemma 3 27B
nectec/thai-research-gemma-3-27b-it, also known as Pathumma LLM AI, is a 27-billion-parameter language model developed by the National Electronics and Computer Technology Center (NECTEC) of Thailand. It builds on Google's Gemma 3 27B base model and has undergone extensive specialization for the Thai language.
Key Capabilities
- Thai Language Proficiency: Continued pre-training on approximately 8 billion Thai tokens significantly enhances its understanding and generation of Thai.
- Instruction Following: Instruction fine-tuned on over 3 million high-quality Thai question-answer pairs, enabling it to follow complex instructions and engage in natural conversations.
- Large Context Window: Supports an input context of 128K tokens and generates outputs up to 8192 tokens, suitable for processing lengthy Thai texts.
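As an instruction-tuned chat model, it can be prompted through the standard Hugging Face text-generation pipeline. The sketch below is illustrative, not an official example from NECTEC: it assumes the model is available on the Hugging Face Hub under the id above, that a recent transformers release with Gemma 3 support is installed, and that enough GPU memory is available for a 27B model (e.g. bfloat16 on a large accelerator). The `build_messages` helper and the Thai prompt are hypothetical.

```python
# Sketch: prompting Pathumma LLM AI via the Hugging Face transformers pipeline.
# Assumes a recent transformers release with Gemma 3 support and sufficient
# GPU memory for a 27B model (e.g. bfloat16 on an 80 GB accelerator).

def build_messages(user_prompt: str) -> list:
    """Wrap a user prompt in the chat-message format expected by
    instruction-tuned models (a list of role/content dicts)."""
    return [{"role": "user", "content": user_prompt}]

def main() -> None:
    import torch
    from transformers import pipeline

    # Load the model in half precision, sharded across available devices.
    pipe = pipeline(
        "text-generation",
        model="nectec/thai-research-gemma-3-27b-it",
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )

    # Example Thai prompt: "Please summarize the advantages of solar energy."
    messages = build_messages("ช่วยสรุปข้อดีของพลังงานแสงอาทิตย์ให้หน่อย")

    # The model generates up to 8192 output tokens; cap lower for a chat reply.
    out = pipe(messages, max_new_tokens=512)

    # In chat mode the pipeline returns the full conversation; the last
    # message is the assistant's reply.
    print(out[0]["generated_text"][-1]["content"])

if __name__ == "__main__":
    main()
```

Because the model weighs in at 27B parameters, quantized loading (e.g. 4-bit via bitsandbytes) is a common alternative when GPU memory is limited.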
Good For
- Thai Conversational AI: Developing chatbots, virtual assistants, and interactive applications that require deep understanding and fluent generation of Thai.
- Thai Natural Language Processing: Tasks such as summarization, question answering, and content generation specifically for the Thai language.
- Research and Development: As a robust foundation for further fine-tuning or research into Thai language models.