mistralai/Mistral-Small-3.2-24B-Instruct-2506
Capabilities: Vision
Concurrency Cost: 2
Model Size: 24B
Quant: FP8
Ctx Length: 32k
Published: Jun 19, 2025
License: apache-2.0
Architecture: Transformer
Availability: Open Weights, Warm
Mistral-Small-3.2-24B-Instruct-2506 is a 24 billion parameter instruction-tuned language model developed by Mistral AI, building upon Mistral-Small-3.1. This model significantly improves instruction following, reduces repetition errors, and features a more robust function calling template. It maintains strong performance across STEM benchmarks and offers multimodal capabilities, making it suitable for complex reasoning tasks and applications requiring precise control.
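To illustrate the function-calling support mentioned above, here is a minimal sketch using an OpenAI-compatible client. The base URL, API key, and the get_weather tool are assumptions for illustration, not values taken from this page.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",                    # placeholder
)

# Hypothetical tool definition, purely for illustration.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="mistralai/Mistral-Small-3.2-24B-Instruct-2506",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# The model may answer directly or request a tool call.
message = response.choices[0].message
if message.tool_calls:
    for call in message.tool_calls:
        print(call.function.name, call.function.arguments)
else:
    print(message.content)
```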
Popular Sampler Settings
Featherless tracks the three most popular parameter combinations its users apply to this model. Each combination tunes the following sampler settings:

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
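These settings map onto request parameters in an OpenAI-compatible API. The sketch below shows where each one goes; the numeric values are illustrative placeholders, not the actual top-3 configs, and the base URL and API key are assumptions. Note that top_k, repetition_penalty, and min_p are not part of the standard OpenAI schema, so compatible servers typically accept them as extra body fields.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",                    # placeholder
)

response = client.chat.completions.create(
    model="mistralai/Mistral-Small-3.2-24B-Instruct-2506",
    messages=[{"role": "user", "content": "Summarize special relativity in two sentences."}],
    # Standard OpenAI sampling fields; values are illustrative, not the page's configs.
    temperature=0.7,
    top_p=0.95,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Non-standard samplers are commonly passed through via extra_body.
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```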