infly/INFIndo-Qwen3-32B-Preview
TEXT GENERATION · Concurrency Cost: 2 · Model Size: 32B · Quant: FP8 · Ctx Length: 32k · Published: May 25, 2025 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

INFIndo-Qwen3-32B-Preview is a 32-billion-parameter language model developed by Infly on the Qwen3 architecture, with a 32,768-token context length. It is enhanced for Indonesian using a proprietary Indonesian-language corpus and a continual fine-tuning pipeline, and it achieves top performance among open-source models on the Indonesian task in SEA-HELM, making it well suited to applications that require strong Indonesian language understanding and generation.


Overview

INFIndo-Qwen3-32B-Preview is a 32-billion-parameter language model from Infly, built on the Qwen3 architecture. It features a 32,768-token context length and is optimized for Indonesian through a dedicated data-production pipeline and a proprietary Indonesian corpus.

Key Capabilities

  • Enhanced Indonesian Language Performance: Achieves top performance among open-source models on the Indonesian task in SEA-HELM, scoring 72.8.
  • Continual Fine-tuning: Trained with a continual fine-tuning pipeline whose effectiveness is validated by the model's improved Indonesian performance.
  • Strong Baseline: Built on the robust Qwen3-32B model, further improving its performance for Indonesian-specific tasks.

Evaluation Highlights

As of May 25, 2025, INFIndo-Qwen3-32B-Preview surpasses other prominent models, including Qwen3-32B (71.9), DeepSeek-R1 (72.1), and Llama 3.3 (70.4), on the SEA-HELM (ID) benchmark, demonstrating its leading position in Indonesian language evaluation.

Good For

  • Applications requiring high-accuracy Indonesian language processing.
  • Developers seeking a powerful open-source model for Indonesian NLP tasks.
  • Research and development in Indonesian language understanding and generation.
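Since the weights are open, a minimal usage sketch with Hugging Face `transformers` may be helpful. The model id is taken from this card; everything else (sufficient GPU memory for a 32B checkpoint, standard Qwen3-style chat-template support, and the example prompt) is an assumption, not a documented interface of this model:

```python
# Minimal sketch, assuming `transformers` is installed and enough GPU memory
# is available for a 32B checkpoint. Only the model id comes from the card.
MODEL_ID = "infly/INFIndo-Qwen3-32B-Preview"


def build_chat(prompt: str) -> list[dict]:
    """Build an OpenAI-style message list for the tokenizer's chat template."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model lazily and generate a completion for one user prompt."""
    # Imported here so the pure helpers above work without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Apply the model's chat template and move inputs to the model's device.
    input_ids = tokenizer.apply_chat_template(
        build_chat(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping special tokens.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    # Hypothetical Indonesian prompt: "Briefly explain what batik is."
    print(generate("Jelaskan secara singkat apa itu batik."))
```

The heavy model load is deferred into `generate`, so the message-building helper can be reused (for example, against an OpenAI-compatible serving endpoint) without pulling in the full 32B checkpoint.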