chutesai/Mistral-Small-3.2-24B-Instruct-2506
Vision · Open Weights · Cold
Concurrency Cost: 2
Model Size: 24B
Quant: FP8
Ctx Length: 32k
Published: Jun 21, 2025
License: apache-2.0
Architecture: Transformer

Mistral-Small-3.2-24B-Instruct-2506 is a 24 billion parameter instruction-tuned language model developed by Mistral AI, building upon the Mistral-Small-3.1-24B-Instruct-2503 series. This model features improved instruction following, reduced repetition errors, and a more robust function calling template. It also supports multimodal inputs, including vision, and is optimized for general instruction-following tasks with a 32K context length.
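Since the model is served with an OpenAI-compatible chat-completions interface on most hosts, a request can be sketched as below. This is a minimal illustration, not the host's documented API: the endpoint path, the `max_tokens` and `temperature` values, and the helper name `build_chat_request` are assumptions for the example; only the model ID comes from this card.

```python
import json

# Model ID as listed on this card; endpoint details below are placeholders.
MODEL_ID = "chutesai/Mistral-Small-3.2-24B-Instruct-2506"

def build_chat_request(user_text: str,
                       system_text: str = "You are a helpful assistant.") -> dict:
    """Assemble a JSON body for a /v1/chat/completions-style call (assumed endpoint)."""
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system", "content": system_text},
            {"role": "user", "content": user_text},
        ],
        # Prompt plus completion must fit in the model's 32K context window.
        "max_tokens": 1024,
        "temperature": 0.7,
    }

payload = build_chat_request("Summarize the Apache-2.0 license in one sentence.")
print(json.dumps(payload, indent=2))
```

The same payload shape extends to the model's function-calling template by adding a `tools` array, and to vision inputs by sending image content parts in the `messages` list, per the serving host's documentation.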
