mistralai/Devstral-Small-2507

Parameters: 24B
Quantization: FP8
Context length: 32768
License: apache-2.0
Overview

Devstral Small 1.1: An Agentic LLM for Software Engineering

Devstral Small 1.1 is a 24-billion-parameter agentic large language model developed in collaboration between Mistral AI and All Hands AI. Fine-tuned from Mistral-Small-3.1, it is purpose-built for software engineering tasks, with an emphasis on tool use for codebase exploration and multi-file editing. The model features a 128k-token context window and uses the Tekken tokenizer with a 131k vocabulary. It is lightweight enough to run on a single RTX 4090 or a Mac with 32 GB of RAM, making it suitable for local and on-device deployment under the Apache 2.0 license.

Key Capabilities

  • Agentic Coding: Designed to excel in agentic coding tasks, making it ideal for software engineering agents.
  • High Performance on SWE-Bench: Scores 53.6% on SWE-Bench Verified, outperforming other state-of-the-art models, including larger ones, by a significant margin when evaluated under the same OpenHands scaffold.
  • Tool Calling: Supports Mistral's function calling format, enhancing its ability to interact with external tools and environments.
  • Efficient Local Deployment: Its compact size allows for efficient local inference using libraries like vLLM, mistral-inference, and transformers.
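
As a sketch of how the tool-calling and local-deployment points above fit together: once the model is served locally (for example with vLLM's OpenAI-compatible server, `vllm serve mistralai/Devstral-Small-2507`), it can be queried over the standard chat-completions API with a tool schema attached. The endpoint URL and the `list_files` tool below are illustrative assumptions, not part of the model or of vLLM:

```python
import json
import urllib.request

# Assumed local endpoint: vLLM's OpenAI-compatible server defaults to
# port 8000; adjust for your deployment.
API_URL = "http://localhost:8000/v1/chat/completions"

def build_request(prompt: str) -> dict:
    """Build an OpenAI-compatible chat request with one example tool.

    The "list_files" tool is a hypothetical illustration of a
    codebase-exploration tool; it is not built into the model."""
    return {
        "model": "mistralai/Devstral-Small-2507",
        "messages": [{"role": "user", "content": prompt}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "list_files",
                "description": "List files under a repository directory.",
                "parameters": {
                    "type": "object",
                    "properties": {"path": {"type": "string"}},
                    "required": ["path"],
                },
            },
        }],
    }

def ask_devstral(prompt: str) -> dict:
    """POST the request to the local server and return the parsed JSON.

    The response may contain either a direct answer or a tool call in
    `choices[0].message.tool_calls`, which an agent scaffold would
    execute and feed back to the model."""
    body = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In an agent loop, the scaffold (not the model) executes any requested tool call and appends the result as a `tool` message before asking for the next completion.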

Good for

  • Software Engineering Agents: Its agentic design and strong performance on SWE-Bench make it highly suitable for automating and assisting in software development workflows.
  • Codebase Exploration and Editing: Excels at navigating and modifying codebases across multiple files.
  • Local Development Environments: Its lightweight nature and support for various local inference frameworks enable developers to run it on consumer-grade hardware.
  • OpenHands Integration: Recommended for use with the OpenHands scaffold for optimal performance in agentic tasks, as demonstrated by examples like analyzing test coverage and building interactive web games.
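
To pair the model with OpenHands, the scaffold reads its LLM settings from a `config.toml` file. A minimal sketch follows; the field names match OpenHands' documented `[llm]` section, but the local endpoint URL is an assumption (it presumes a vLLM server on port 8000), and the exact model-name prefix required by OpenHands' LiteLLM backend should be checked against its documentation:

```toml
[llm]
model = "mistralai/Devstral-Small-2507"
base_url = "http://localhost:8000/v1"  # assumed local vLLM endpoint
api_key = "EMPTY"                      # local servers typically ignore the key
```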