darkc0de/Mistral-Small-3.2-24B-Instruct-2506-heretic
Vision · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Feb 17, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

darkc0de/Mistral-Small-3.2-24B-Instruct-2506-heretic is a 24-billion-parameter instruction-tuned language model: a decensored version of Mistral-Small-3.2-24B-Instruct-2506 produced with Heretic v1.2.0. It has a 32,768-token context window and refuses far fewer prompts than the original model (4/100 refusals vs. 98/100). The model retains the base model's strengths in instruction following, low repetition, and robust function calling, making it suitable for applications that need less restrictive content generation alongside reliable behavior.
