Mistral-Small-3.1-24B-Instruct-2503 is a 24-billion-parameter instruction-tuned model from Mistral AI that builds on Mistral Small 3. It adds state-of-the-art vision understanding and extends the context window to 128k tokens while retaining strong text performance. The model is optimized for fast-response conversational agents, low-latency function calling, and local inference, and it handles a wide range of multilingual and advanced reasoning tasks.
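
Below is a minimal sketch of querying the model for a fast-response chat turn. It assumes the model is served locally behind an OpenAI-compatible endpoint (for example, a vLLM server at `http://localhost:8000/v1`); the endpoint URL, placeholder API key, and sampling parameters are illustrative, not prescribed by the model card.

```python
# Minimal sketch: chat with a locally served Mistral-Small-3.1-24B-Instruct-2503
# through an OpenAI-compatible API. Assumes an inference server (e.g. vLLM)
# is already running at the base_url below.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local endpoint
    api_key="EMPTY",                      # placeholder; local servers often ignore it
)

response = client.chat.completions.create(
    model="mistralai/Mistral-Small-3.1-24B-Instruct-2503",
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize the benefits of a 128k-token context window."},
    ],
    temperature=0.15,   # illustrative low temperature for deterministic replies
    max_tokens=256,
)

print(response.choices[0].message.content)
```

The same client can pass a `tools` list in the request to exercise the model's function-calling support, keeping the serving setup unchanged.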