RedHatAI/Mistral-Small-3.1-24B-Instruct-2503 is a 24-billion-parameter instruction-tuned language model developed by Mistral AI, building upon Mistral Small 3 (2501). It adds state-of-the-art vision understanding and extends long-context capabilities up to a 128k-token context window while maintaining strong text performance. The model excels in fast-response conversational agents, low-latency function calling, and local inference on sensitive data.