akoumpa/Devstral-Small-2-24B-Instruct-2512-BF16
Overview
Devstral Small 2 24B Instruct 2512: Agentic LLM for Software Engineering
Developed by Mistral AI, Devstral Small 2 is a 24-billion-parameter instruction-tuned language model engineered for agentic software engineering. It is designed to excel at tool use for codebase exploration and multi-file editing, and at powering sophisticated software engineering agents. The model features a 256k-token context window and adds vision capabilities, allowing it to analyze images in addition to text.
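As a quick orientation, here is a minimal sketch of querying the model over an OpenAI-compatible endpoint. Serving with vLLM, the localhost URL, and the dummy API key are assumptions for illustration, not an officially prescribed setup.

```python
# Minimal usage sketch, assuming the checkpoint is served behind an
# OpenAI-compatible endpoint (for example:
#   vllm serve akoumpa/Devstral-Small-2-24B-Instruct-2512-BF16
# ). The local URL, API key, and served model name below are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

response = client.chat.completions.create(
    model="akoumpa/Devstral-Small-2-24B-Instruct-2512-BF16",
    messages=[
        {"role": "system", "content": "You are a software engineering assistant."},
        {"role": "user", "content": "Explain what this function does:\n\ndef dedupe(xs):\n    return sorted(set(xs))"},
    ],
)
print(response.choices[0].message.content)
```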
Key Capabilities
- Agentic Coding: Optimized for complex software engineering workflows, enabling deep codebase understanding and tool integration.
- Lightweight Deployment: At 24B parameters, it is efficient enough to run locally on consumer-grade hardware such as an RTX 4090 or a Mac with 32GB of RAM (see the loading sketch after this list).
- Vision Capabilities: Accepts and analyzes images alongside text, extending its usefulness across diverse engineering tasks.
- Improved Performance: Demonstrates enhanced generalization and performance compared to its predecessors, achieving 68.0% on SWE-bench Verified.
- Apache 2.0 License: Offers flexibility for both commercial and non-commercial use.
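For local inference, the sketch below assumes the BF16 checkpoint loads through the standard text-only Transformers path; the vision features are not exercised here, and the vision-enabled variant may require a multimodal processor class instead.

```python
# Local-inference sketch; assumes a standard Hugging Face Transformers layout.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "akoumpa/Devstral-Small-2-24B-Instruct-2512-BF16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # keep the native BF16 weights
    device_map="auto",           # let Accelerate place layers across GPU/CPU memory
)

messages = [{"role": "user", "content": "Write a Python function that reverses a singly linked list."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that the unquantized BF16 weights are roughly 48 GB (24B parameters at 2 bytes each), so fitting them on a single 24 GB GPU such as an RTX 4090 generally relies on CPU offloading or a quantized variant.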
Good For
- AI Code Assistants: Powering intelligent assistants that interact with codebases.
- Agentic Coding: Developing autonomous agents for software development (a minimal agent-loop sketch follows this list).
- Software Engineering Tasks: Automating and assisting with complex coding work that requires deep codebase understanding.
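To make the agentic-coding use case concrete, below is a minimal sketch of an agent loop against an OpenAI-compatible endpoint. The endpoint URL, the run_command tool, and the five-step cap are hypothetical illustrations, not part of any official Devstral scaffold.

```python
# Minimal agent-loop sketch: the model requests tool calls, we execute them,
# and feed the results back until it produces a final answer (or we hit the cap).
import json
import subprocess
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")
MODEL = "akoumpa/Devstral-Small-2-24B-Instruct-2512-BF16"

# Hypothetical single tool: run a read-only shell command inside the repository.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "run_command",
        "description": "Run a read-only shell command in the repository and return its output.",
        "parameters": {
            "type": "object",
            "properties": {"command": {"type": "string"}},
            "required": ["command"],
        },
    },
}]

messages = [
    {"role": "system", "content": "You are a software engineering agent. Inspect the repository with tools before answering."},
    {"role": "user", "content": "Which files define this package's public API?"},
]

for _ in range(5):  # cap the number of agent steps
    response = client.chat.completions.create(model=MODEL, messages=messages, tools=TOOLS)
    msg = response.choices[0].message
    if not msg.tool_calls:
        print(msg.content)  # final answer: the model stopped requesting tools
        break
    messages.append(msg)  # keep the assistant's tool-call turn in the transcript
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        result = subprocess.run(args["command"], shell=True, capture_output=True, text=True)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": result.stdout or result.stderr,
        })
```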