Outlier-Ai/Outlier-10B-V2
Text Generation
Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Ctx Length: 32k
Published: Apr 7, 2026
License: apache-2.0
Architecture: Transformer
Open Weights
Outlier-Ai/Outlier-10B-V2 is a 10.6-billion-parameter, ternary-quantized Mixture-of-Experts (MoE) language model built on the Qwen2.5-7B-Instruct base. Distilled with lightweight Context-Aware KL Divergence (CAKLD), it retains 99.1% of its teacher's performance on MMLU, scoring 75.96%. With roughly 7.4 billion active parameters per token and an inference RAM footprint of around 5 GB, it is designed for efficient inference in applications that need strong performance under tight memory constraints.
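A minimal inference sketch is shown below. Only the repository name comes from the card above; the Hugging Face transformers loading path, the chat template, and the generation settings are illustrative assumptions, not values published by the model authors.

```python
# Illustrative sketch: load the checkpoint and generate a reply.
# Assumes the repo is loadable via transformers' Auto* classes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Outlier-Ai/Outlier-10B-V2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native (quantized) dtype
    device_map="auto",    # spread weights across available devices to fit the ~5 GB footprint
)

messages = [{"role": "user", "content": "Summarize the benefits of MoE models."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because only the experts routed for each token are activated (about 7.4B of the 10.6B total parameters), per-token compute stays close to that of a dense ~7B model while the full parameter count remains larger.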