Model Overview
This model, llmfan46/Mistral-Small-3.2-24B-Instruct-2506-ultra-uncensored-heretic, is a decensored variant of Mistral AI's Mistral-Small-3.2-24B-Instruct-2506. It was developed by llmfan46 using the Heretic v1.2.0 tool, specifically employing the Arbitrary-Rank Ablation (ARA) method to modify the original model's behavior.
Key Differentiators
- Significantly Reduced Refusals: Refusals drop from 98 of 100 test prompts on the original model to 2 of 100 on this variant, a roughly 98% reduction. This makes it well suited for use cases requiring less restrictive content generation.
- Quality Preservation: Despite decensoring, the model maintains high quality, exhibiting a low KL divergence of 0.0369 from the original model's output distribution, which indicates that its core capabilities and coherence are largely preserved.
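The KL divergence figure above measures how far the decensored model's next-token probability distribution drifts from the original's; values near zero mean near-identical behavior. As an illustration only (toy distributions, not the actual models' outputs), the metric can be computed like this:

```python
import math

def kl_divergence(p, q):
    """KL(P || Q) for discrete probability distributions given as lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy next-token distributions: nearly identical, so the KL divergence is
# small, analogous to the 0.0369 reported for this model vs. the original.
p = [0.70, 0.20, 0.10]
q = [0.68, 0.21, 0.11]
print(round(kl_divergence(p, q), 4))
```

A small positive value here plays the same role as the reported 0.0369: the closer to zero, the less the ablation has changed the model's behavior on ordinary inputs.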
Core Capabilities (inherited from Mistral-Small-3.2-24B-Instruct-2506)
- Enhanced Instruction Following: Improves upon its predecessor in accurately following precise instructions.
- Reduced Repetition Errors: Produces fewer infinite generations or repetitive answers, leading to more concise and relevant outputs.
- Robust Function Calling: Features a more robust function calling template, excelling in tool-use tasks.
- Vision Reasoning: Capable of processing and reasoning with image inputs, as demonstrated by examples involving image analysis for decision-making.
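For the function-calling capability, tools are typically described to the model as JSON schemas passed through the chat template. The sketch below shows the general OpenAI-style schema shape that Hugging Face chat templates accept; the `get_weather` tool is a hypothetical example, not part of this model's release:

```python
import json

# Hypothetical tool definition in the OpenAI-style JSON schema commonly
# accepted by Hugging Face chat templates via apply_chat_template(tools=...).
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical example tool
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Paris?"}]

# In actual use, these structures would be rendered into the prompt with
# tokenizer.apply_chat_template(messages, tools=tools, ...).
print(json.dumps(tools, indent=2))
```

The model's improved function-calling template is what makes it reliable at emitting a well-formed call (tool name plus JSON arguments) in response to prompts like the one above.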
Performance Highlights
- Instruction Following: Achieves 65.33% on Wildbench v2 and 43.1% on Arena Hard v2, showing significant improvements over the 3.1 version.
- STEM Benchmarks: Demonstrates strong performance in STEM tasks, with 69.06% on MMLU Pro (5-shot CoT), 78.33% on MBPP Plus - Pass@5, and 92.90% on HumanEval Plus - Pass@5.
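Pass@5 reports the probability that at least one of five sampled completions passes a problem's unit tests. A common way to estimate it is the unbiased estimator popularized by the HumanEval benchmark, sketched here on toy numbers rather than this model's actual evaluation runs:

```python
from math import comb

def pass_at_k(n, c, k):
    """Unbiased pass@k estimate given n samples of which c are correct."""
    if n - c < k:
        return 1.0  # too few failures to fill a sample of size k
    return 1.0 - comb(n - c, k) / comb(n, k)

# Toy example: 10 samples per problem, 4 passing, estimate pass@5.
print(round(pass_at_k(10, 4, 5), 4))  # → 0.9762
```

The benchmark score is then the average of this estimate over all problems in the suite.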
Ideal Use Cases
This model is particularly well-suited for developers and applications that require a powerful, instruction-tuned language model with minimal content restrictions, while still benefiting from strong instruction following, function calling, and multimodal (vision) capabilities.