MiniMax M2.5 Now Available on Featherless!
MiniMax M2.5 is now live on Featherless. It's a frontier-class reasoning model that tops SWE-Bench Verified at 80.2%, available now through our API.

We're excited to announce that MiniMax M2.5 is now available on Featherless. This is one of the most significant model releases we've seen: a genuine frontier model that delivers exceptional coding and agentic capabilities while remaining remarkably cost-efficient.
Why M2.5 Matters
MiniMax has been on an impressive trajectory. In the last couple of months, they've released M2, M2.1, and now M2.5, with improvements that outpace the Claude, GPT, and Gemini model families on key benchmarks.
The headline numbers speak for themselves:
| Benchmark | M2.5 Score |
|---|---|
| SWE-Bench Verified | 80.2% |
| Multi-SWE-Bench | 51.3% |
| BrowseComp | 76.3% |
On the Droid harness, M2.5 scores 79.7% vs. Claude Opus 4.6's 78.9%; on OpenCode, 76.1% vs. 75.9%.
Built for Real Coding Work
What sets M2.5 apart from other reasoning models is how it approaches development tasks. The model exhibits what MiniMax calls a "spec-writing tendency": before writing code, it actively decomposes and plans features, structure, and UI design like an experienced software architect.
M2.5 was trained across 10+ languages (Go, C, C++, TypeScript, Rust, Kotlin, Python, Java, JavaScript, PHP, Lua, Dart, Ruby) in over 200,000 real-world environments.
Why This Matters for Featherless Users
If you're building AI-powered applications, coding assistants, or agentic workflows, M2.5 gives you frontier-level performance without the frontier-level bill. Combined with Featherless's flat-rate plans starting at $25/month, you can experiment and iterate without token anxiety.
M2.5 is particularly compelling for:
AI coding tools (Cline, Roo Code) where reasoning quality directly impacts output
Agentic applications that need reliable tool calling and search
Batch processing where cost efficiency scales with volume
Prototyping where you want to test frontier capabilities before committing
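For the agentic use cases above, M2.5 is reached through the standard OpenAI-compatible tools API. A minimal sketch of the client-side half of that loop, assuming a hypothetical `get_file_length` tool (the schema format is the standard OpenAI function-calling shape; the tool name and dispatcher are illustrative, not part of Featherless's API):

```python
import json

# Hypothetical tool schema in the OpenAI function-calling format;
# this list would be passed via the `tools` parameter of
# client.chat.completions.create(...).
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_file_length",
        "description": "Return the number of lines in a file",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}]

def get_file_length(path: str) -> int:
    # Count lines without loading the whole file into memory.
    with open(path) as f:
        return sum(1 for _ in f)

# Map tool names the model may emit to local implementations.
REGISTRY = {"get_file_length": get_file_length}

def dispatch(tool_call: dict) -> str:
    """Execute one tool call from a model response.

    Expects {"name": ..., "arguments": "<json string>"}, mirroring the
    fields on an OpenAI-style tool call, and returns a JSON string to
    send back as the tool result message.
    """
    fn = REGISTRY[tool_call["name"]]
    args = json.loads(tool_call["arguments"])
    return json.dumps({"result": fn(**args)})
```

In a real agent loop you would append the string `dispatch` returns as a `"tool"` role message and call the model again until it stops requesting tools.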
Get Started
M2.5 is available now through the Featherless API. It's OpenAI-compatible, so swapping in is straightforward:
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="your-featherless-api-key",
)

response = client.chat.completions.create(
    model="MiniMaxAI/MiniMax-M2.5",
    messages=[
        {"role": "user", "content": "Your prompt here"}
    ],
)
```

Check out our model catalog to explore M2.5 alongside our 24,000+ other available models.
Have questions or want to share what you're building with M2.5? Join us on Discord.