aisingapore/Llama-SEA-Guard-8B-2602
Text Generation

Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Dec 2, 2025
License: llama3.1
Architecture: Transformer

Llama-SEA-Guard-8B-2602 is an 8 billion parameter decoder-only language model developed by AI Singapore, fine-tuned from Llama-SEA-LION-v3-8B-IT. It is designed for safety classification in Southeast Asian contexts, supporting Burmese, English, Indonesian, Malay, Tagalog, Tamil, Thai, and Vietnamese with a 128k token context length. Its primary use case is classifying user requests and AI responses as "safe" or "unsafe", and it is intended for direct application without further fine-tuning.
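A guard model like this is typically wrapped in a small classification layer around text generation: build a prompt from the user request (and optionally the AI response), generate, then map the output to one of the two labels. The sketch below illustrates that wrapper shape in plain Python; the prompt template, the `fake_generate` stub, and the fail-closed parsing are illustrative assumptions, not the model's documented interface. In real use you would load `aisingapore/Llama-SEA-Guard-8B-2602` with Hugging Face `transformers` and replace the stub with an actual `generate` call.

```python
from typing import Optional

def build_guard_prompt(user_request: str, ai_response: Optional[str] = None) -> str:
    """Assemble a classification prompt (format is an assumption,
    not the model's documented chat template)."""
    parts = [f"User request:\n{user_request}"]
    if ai_response is not None:
        parts.append(f"AI response:\n{ai_response}")
    parts.append("Classify the above as 'safe' or 'unsafe'.")
    return "\n\n".join(parts)

def parse_guard_label(generated_text: str) -> str:
    """Map raw model output to one of the two labels; fail closed
    (return 'unsafe') when the output is ambiguous."""
    text = generated_text.strip().lower()
    if text.startswith("safe"):
        return "safe"
    if text.startswith("unsafe"):
        return "unsafe"
    return "unsafe"  # unexpected output: treat as unsafe

# Stub standing in for a real model.generate() call.
def fake_generate(prompt: str) -> str:
    return "safe"

prompt = build_guard_prompt("What's the weather in Singapore today?")
label = parse_guard_label(fake_generate(prompt))
print(label)  # safe (with the stub above)
```

Failing closed on ambiguous output is a common choice for safety classifiers, since a missed "unsafe" is usually costlier than a false positive.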
