unsloth/Mistral-Small-3.2-24B-Instruct-2506
Capabilities: Vision
Concurrency Cost: 2
Model Size: 24B
Quant: FP8
Ctx Length: 32k
Published: Jun 20, 2025
License: apache-2.0
Architecture: Transformer

Mistral-Small-3.2-24B-Instruct-2506 is a 24 billion parameter instruction-tuned language model developed by Mistral AI, building upon the Mistral-Small-3.1 series. The model features a 32768-token context length and is specifically enhanced for improved instruction following, reduced repetition errors, and more robust function calling. It performs well on complex reasoning, code generation, and vision tasks, making it suitable for applications requiring precise control and multimodal understanding.
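The improved function calling can be exercised with an OpenAI-style tool schema. The sketch below is illustrative only: the `get_weather` tool is hypothetical, and the request-body field names assume an OpenAI-compatible chat-completions endpoint (which Featherless exposes), not a documented configuration for this specific model.

```python
# Hypothetical OpenAI-style tool definition for exercising function calling.
# Tool name, parameters, and field conventions are illustrative assumptions.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool name
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Request body as it would be sent to an OpenAI-compatible endpoint.
# It is only constructed here, not sent, to show the payload shape.
request_body = {
    "model": "unsloth/Mistral-Small-3.2-24B-Instruct-2506",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [weather_tool],
    "tool_choice": "auto",
}
```

When the model decides to call the tool, the response carries a `tool_calls` entry whose arguments the client parses and executes before returning the result in a follow-up `tool` message.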


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model each adjust the following sampler parameters:

temperature: randomness of token sampling
top_p: nucleus sampling probability cutoff
top_k: restricts sampling to the k most likely tokens
frequency_penalty: penalizes tokens in proportion to how often they have already appeared
presence_penalty: penalizes tokens that have appeared at all
repetition_penalty: multiplicative penalty applied to repeated tokens
min_p: minimum probability cutoff relative to the most likely token
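As a sketch, these sampler parameters map directly onto fields of an OpenAI-compatible completion request. The values below are placeholders chosen for illustration, not the actual popular configurations reported by Featherless:

```python
# Placeholder sampler values -- illustrative assumptions, not the
# measured "top 3" Featherless configurations for this model.
sampler_settings = {
    "temperature": 0.7,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

# Merge the sampler fields into a chat request body (constructed only,
# not sent) for an OpenAI-compatible endpoint.
request_body = {
    "model": "unsloth/Mistral-Small-3.2-24B-Instruct-2506",
    "messages": [{"role": "user", "content": "Write a haiku about mountains."}],
    **sampler_settings,
}
```

Lower `temperature` with a modest `repetition_penalty` is a common starting point for instruction-following tasks; raising `min_p` trims low-probability tails without capping diversity as hard as `top_k` does.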