TeichAI/Devstral-Small-2505-Deepseek-V3.2-Speciale-Distill
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Feb 4, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

TeichAI/Devstral-Small-2505-Deepseek-V3.2-Speciale-Distill is a 24-billion-parameter Mistral-based language model by TeichAI, fine-tuned from unsloth/devstral-small-2505. It was trained with Unsloth and Hugging Face's TRL library, which emphasize faster fine-tuning. With a 32,768-token context length, it is suited to applications that require efficient processing of longer sequences. As the name suggests, it is a distillation onto the Devstral Small 2505 coding base, so it should inherit that base model's strengths in software-engineering tasks.
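Since the card names a standard Hugging Face repository, the checkpoint can presumably be loaded with the `transformers` library like any other causal LM. The sketch below is an assumption, not an official usage example from TeichAI: the dtype, `device_map`, and prompt are illustrative, and actually running `generate()` downloads and loads the full 24B-parameter checkpoint, which requires substantial GPU memory.

```python
# Minimal inference sketch (assumes the `transformers` and `torch` libraries
# are installed and that enough GPU memory is available for a 24B checkpoint).

MODEL_ID = "TeichAI/Devstral-Small-2505-Deepseek-V3.2-Speciale-Distill"
MAX_CONTEXT = 32768  # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the checkpoint and complete `prompt`. Heavy: downloads the weights."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # illustrative; the hosted quant is FP8
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = out[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

For production serving, an OpenAI-compatible server such as vLLM would be a more typical choice for a model of this size than raw `transformers` generation.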
