EntermindAI/Rukun-32B-V
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Feb 2, 2026 · License: other · Architecture: Transformer

EntermindAI/Rukun-32B-V is a 32-billion-parameter language model built on Qwen/Qwen2.5-32B-Instruct and fine-tuned with LoRA for structured validation of content against Malaysia's Rukun Negara principles. The model specializes in returning strict JSON outputs with principle-level scoring, severity assessment, and explanations for policy compliance. It supports Bahasa Malaysia, English, and code-switched input, making it well suited to automated, localized content moderation and policy assessment.
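Because the exact response schema is not documented on this page, the sketch below assumes a hypothetical output shape with per-principle scores, an overall severity, and an explanation; all field names (`principles`, `score`, `severity`, `explanation`) are illustrative, not the model's confirmed schema. It shows how a downstream moderation pipeline might consume the strict JSON output:

```python
import json

# Hypothetical example of the strict JSON output described above;
# the real schema may differ (field names are assumptions).
raw_response = """
{
  "principles": [
    {"name": "Belief in God", "score": 0.92},
    {"name": "Loyalty to King and Country", "score": 0.88}
  ],
  "severity": "low",
  "explanation": "Content is consistent with the stated principles."
}
"""

result = json.loads(raw_response)

# Principle-level scoring: flag any principle scoring below a chosen threshold.
THRESHOLD = 0.5
flagged = [p["name"] for p in result["principles"] if p["score"] < THRESHOLD]

print(result["severity"])  # → low
print(flagged)             # → []
```

Because the model is advertised as emitting strict JSON, a plain `json.loads` plus key checks like the above is typically enough; a schema validator could be layered on top once the real output format is known.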
