unsloth/Magistral-Small-2506
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Jun 10, 2025 · License: apache-2.0 · Architecture: Transformer

Magistral-Small-2506 is a 24 billion parameter reasoning model from Mistral AI, distributed here by unsloth. It builds on Mistral Small 3.1, adding reasoning capability through supervised fine-tuning (SFT) on traces from Magistral Medium, followed by reinforcement learning (RL). The model supports dozens of languages and, once quantized, can run locally on a single RTX 4090 or a MacBook with 32 GB of RAM. It has a 128k context window, though performance is best at up to 40k tokens.
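For local deployment, one option is serving the checkpoint with vLLM. The following is a minimal sketch, assuming vLLM is installed and the FP8 checkpoint fits in GPU memory; the context length is capped at 32k to match this listing, and the flags are standard vLLM options rather than anything specific to this repository.

```shell
# Serve the model with an OpenAI-compatible API (sketch; assumes vLLM is installed).
# --max-model-len caps the context at 32k tokens, matching this deployment.
# --tokenizer-mode mistral uses the Mistral tokenizer format recommended for Mistral-family models.
vllm serve unsloth/Magistral-Small-2506 \
  --max-model-len 32768 \
  --tokenizer-mode mistral
```

Once the server is up, the model can be queried through the standard OpenAI-compatible `/v1/chat/completions` endpoint.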
