MagicalAlchemist/Llama-SEA-LION-v3-8B-IT-Magic_decensored
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Jan 25, 2026 · License: llama3.1 · Architecture: Transformer

MagicalAlchemist/Llama-SEA-LION-v3-8B-IT-Magic_decensored is an 8-billion-parameter decoder-only language model based on the Llama 3.1 architecture, with a 32,768-token context length. It is a decensored version of AI Singapore's Llama-SEA-LION-v3-8B-IT model, modified to reduce refusal rates. The model is instruction-tuned for Southeast Asian languages, supporting Burmese, Chinese, English, Filipino, Indonesian, Javanese, Khmer, Lao, Malay, Sundanese, Tamil, Thai, and Vietnamese, making it suitable for multilingual applications in the SEA region.
