coder3101/Mistral-Small-3.2-24B-Instruct-2506-heretic
Vision · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Feb 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

coder3101/Mistral-Small-3.2-24B-Instruct-2506-heretic is a 24 billion parameter instruction-tuned language model with a 32K context length, derived from Mistral-Small-3.2-24B-Instruct-2506. It has been decensored with the Heretic v1.1.0 tool, significantly reducing refusals compared to the original model. It performs well at instruction following, function calling, and vision reasoning, and reduces repetition errors.