unsloth/Mistral-Small-3.1-24B-Base-2503
Vision · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Mar 18, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

Mistral-Small-3.1-24B-Base-2503 is a 24-billion-parameter base model developed by Mistral AI, building on the Mistral Small 3 architecture. It features state-of-the-art vision understanding and an extended 128k-token context window, alongside strong multilingual text capabilities. The model is designed for advanced multimodal applications that combine visual understanding with extensive textual analysis.
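As a rough illustration of what a long context window allows, here is a back-of-the-envelope check of whether a prose document fits. The 1.3 tokens-per-word ratio is an assumption for English text, not a property of this model's tokenizer; note also that the deployment listing above advertises a 32k context, so the deployed limit should be used in practice.

```python
# Heuristic context-budget check. The tokens-per-word ratio is an assumption
# for English prose, NOT the actual Mistral tokenizer; use the real tokenizer
# for exact counts.
CONTEXT_WINDOW = 128_000  # tokens, per the model description above

def fits_in_context(text: str, tokens_per_word: float = 1.3) -> bool:
    """Estimate whether `text` fits within the model's context window."""
    estimated_tokens = int(len(text.split()) * tokens_per_word)
    return estimated_tokens <= CONTEXT_WINDOW

print(fits_in_context("a short prompt"))   # True
print(fits_in_context("word " * 200_000))  # False: ~260k estimated tokens
```

For precise budgeting, tokenize with the model's own tokenizer rather than estimating from word counts.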
