grayarea/Mistral-Small-3.2-24B-Instruct-2506-Text-Only
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Mar 21, 2026 · Architecture: Transformer

grayarea/Mistral-Small-3.2-24B-Instruct-2506-Text-Only is a 24-billion-parameter instruction-tuned language model based on the Mistral architecture, with a 32,768-token context window. This variant is text-only: the vision encoder present in other versions of the model has been removed. It is optimized for general text generation and instruction-following tasks.
