Agnes-AI/Agnes-SeaLLM-8b
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Jan 8, 2026
License: apache-2.0
Architecture: Transformer
Weights: Open

Agnes-SeaLLM-8B is a compact 8-billion-parameter large language model developed by Agnes-AI, optimized for Southeast Asian languages and supporting a 32,768-token (32k) context length. It delivers performance comparable to much larger models in mathematical reasoning, translation, and instruction following. The model is engineered to minimize hallucinations and to provide culturally sensitive responses, and it performs well across multi-dimensional benchmarks, including M3Exam and MMLU.