NousResearch/DeepHermes-3-Mistral-24B-Preview
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Mar 2, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

DeepHermes 3 - Mistral 24B Preview by Nous Research is a 24-billion-parameter language model with a 32,768-token context length, designed to unify intuitive responses and long chain-of-thought reasoning in a single model. Users toggle deep reasoning on or off via a system prompt, and the model also brings improved LLM annotation, judgment, and function-calling capabilities. It is optimized for advanced agentic tasks, multi-turn conversations, and enhanced user steerability.
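As a minimal sketch of the system-prompt toggle, the snippet below builds a request payload for an OpenAI-compatible chat endpoint, enabling deep reasoning only when asked. The wording of `DEEP_REASONING_PROMPT` is an assumption based on this card's description; consult the official Nous Research model card for the exact prompt text.

```python
import json

# Assumed reasoning-toggle system prompt; verify against the official
# Nous Research model card before relying on this exact wording.
DEEP_REASONING_PROMPT = (
    "You are a deep thinking AI, you may use extremely long chains of "
    "thought to deeply consider the problem and deliberate with yourself "
    "via systematic reasoning to come to a correct solution prior to "
    "answering. Enclose your thoughts inside <think> </think> tags, "
    "then provide your final response."
)

def build_request(user_message: str, deep_reasoning: bool = False) -> dict:
    """Build a chat-completion payload, optionally enabling long CoT."""
    messages = []
    if deep_reasoning:
        # Presence of the system prompt switches the model into
        # long chain-of-thought mode; omitting it keeps intuitive mode.
        messages.append({"role": "system", "content": DEEP_REASONING_PROMPT})
    messages.append({"role": "user", "content": user_message})
    return {
        "model": "NousResearch/DeepHermes-3-Mistral-24B-Preview",
        "messages": messages,
        "max_tokens": 2048,  # leave headroom within the 32k context
    }

payload = build_request("How many primes are below 100?", deep_reasoning=True)
print(json.dumps(payload, indent=2))
```

Sending the same payload without the system message yields the model's default intuitive-response mode.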
