lactroiii/Dolphin3.0-R1-Mistral-24B
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Apr 8, 2026 · Architecture: Transformer

Dolphin 3.0 R1 Mistral 24B is a 24-billion-parameter instruct-tuned model from the Dolphin 3.0 Collection, curated and trained by Eric Hartford, Ben Gitter, BlouseJury, and Cognitive Computations. It is designed as a general-purpose local model that excels at coding, math, agentic tasks, and function calling, with a 32,768-token context length. It was trained for 3 epochs on 800k reasoning traces from the Dolphin-R1 dataset, with a focus on general-purpose reasoning and user steerability.
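
Since the model follows a standard chat format, a minimal sketch of running it locally with Hugging Face transformers might look like the following. The repo id is taken from the header above; the dtype, prompt, and generation settings are assumptions, and the FP8 quantization served here may require a dedicated inference backend (e.g. vLLM) rather than a plain transformers load:

```python
# Minimal usage sketch, assuming the repo id above is a standard
# Hugging Face checkpoint with a chat template. Not an official example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lactroiii/Dolphin3.0-R1-Mistral-24B"  # repo id from the header

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; FP8 needs a supporting backend
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": "Write a function that reverses a string."},
]

# Render the conversation with the model's own chat template.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```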
