Qwen-SEA-LION-v4-32B-IT is a 32-billion-parameter, instruction-tuned, decoder-only large language model developed by AI Singapore. Built on the Qwen3 architecture, it underwent continued pre-training on 100 billion tokens from the SEA-Pile v2 corpus, specifically targeting seven Southeast Asian languages. The model is optimized for multilingual understanding and generation in Southeast Asian contexts and supports a 32K-token context length.
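As a minimal sketch of how an instruction-tuned checkpoint like this is typically used, the snippet below loads the model with the Hugging Face `transformers` chat-template workflow. The model ID string and the helper names (`build_chat`, `generate_reply`) are assumptions for illustration; check the hosting page for the exact repository ID and recommended generation settings.

```python
def build_chat(user_message: str) -> list[dict]:
    # Instruction-tuned checkpoints expect chat-format messages
    # (role/content dicts) rather than raw text.
    return [{"role": "user", "content": user_message}]

def generate_reply(user_message: str, max_new_tokens: int = 256) -> str:
    # Imports kept inside the function so the helper above can be used
    # without transformers/torch installed; loading 32B weights needs
    # substantial GPU memory, so treat this as a sketch, not a recipe.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Hypothetical repository ID inferred from the card title.
    model_id = "aisingapore/Qwen-SEA-LION-v4-32B-IT"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_chat(user_message),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

For example, `generate_reply("Terjemahkan ke bahasa Inggris: Selamat pagi!")` would prompt the model with a single-turn chat message in Indonesian, one of the Southeast Asian languages the continued pre-training targets.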