darkc0de/XortronCriminalComputingConfig-heretic
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Feb 1, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

darkc0de/XortronCriminalComputingConfig-heretic is a 24-billion-parameter language model with a 32K-token context length, derived from darkc0de/XortronCriminalComputingConfig using the Heretic v1.1.0 abliteration tool. It is engineered for uncensored output, showing a significantly lower refusal rate than its parent model, and generates content that typically falls outside the guardrails of standard LLMs, making it suitable for use cases requiring unrestricted output.


Model Overview

darkc0de/XortronCriminalComputingConfig-heretic is a 24 billion parameter language model with a 32K token context length. It is a decensored variant of the original darkc0de/XortronCriminalComputingConfig, created with the Heretic v1.1.0 tool. The modification process, termed "abliteration," identifies the refusal behavior encoded in the model's internal activations and suppresses it, reducing the model's tendency to decline requests while leaving its other capabilities largely intact.
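For local experimentation, the model can be loaded like any other decoder-only checkpoint. The sketch below assumes the weights are hosted on Hugging Face under the repo id above and that the tokenizer ships a chat template; neither is confirmed by this page, so adjust to your setup.

```python
# Minimal sketch: load and query the model with Hugging Face transformers.
# Repo id and chat template availability are assumptions from this page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "darkc0de/XortronCriminalComputingConfig-heretic"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # the FP8 quant targets inference servers; bf16 is a safe local default
    device_map="auto",           # shard the 24B weights across available GPUs
)

messages = [{"role": "user", "content": "In one paragraph, what is abliteration?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```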

Key Characteristics

  • Decensored Output: Engineered to provide uncensored responses; content refusals drop from 20/100 in the original model to 6/100 in this variant.
  • Performance: Achieves a KL divergence of 0.0062 from the original model, meaning its output distribution stays very close to the parent's even as its refusal behavior changes (see the sketch after this list).
  • UGI Leaderboard: As of July 2025, this model leads the UGI Leaderboard among models under 70 billion parameters in both the UGI and W10 categories, reflecting its specialized performance in unrestricted content generation.
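To make the KL figure concrete, here is a rough sketch of how the divergence between the two models' next-token distributions could be measured. The prompt set and sequence-averaged KL(P‖Q) are illustrative assumptions, not necessarily Heretic's exact evaluation procedure.

```python
# Minimal sketch: per-token KL divergence between the original and
# abliterated models on a shared prompt set. Illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE_ID = "darkc0de/XortronCriminalComputingConfig"
HERETIC_ID = "darkc0de/XortronCriminalComputingConfig-heretic"

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base = AutoModelForCausalLM.from_pretrained(
    BASE_ID, torch_dtype=torch.bfloat16, device_map="auto"
)
heretic = AutoModelForCausalLM.from_pretrained(
    HERETIC_ID, torch_dtype=torch.bfloat16, device_map="auto"
)

prompts = ["Describe the water cycle."]  # in practice, a larger held-out set

kls = []
for prompt in prompts:
    ids = tokenizer(prompt, return_tensors="pt").input_ids
    with torch.no_grad():
        log_p = base(ids.to(base.device)).logits.float().log_softmax(dim=-1)
        log_q = heretic(ids.to(heretic.device)).logits.float().log_softmax(dim=-1)
        log_q = log_q.to(log_p.device)
    # KL(P || Q) at each token position, averaged over the sequence
    kl = (log_p.exp() * (log_p - log_q)).sum(dim=-1).mean()
    kls.append(kl.item())

print(f"mean per-token KL divergence: {sum(kls) / len(kls):.4f}")
```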

Use Cases

This model is designed for applications requiring unrestricted and uncensored language generation, particularly where standard LLMs would refuse to respond on safety or ethical grounds. Given its nature, users should exercise responsibility and discretion.
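Hosted deployments of models like this typically expose an OpenAI-compatible API. In the sketch below, the base URL and API key are placeholders for whatever your provider issues; only the model id comes from this page.

```python
# Minimal sketch: query the model through an OpenAI-compatible endpoint.
# Base URL and key are hypothetical placeholders, not real values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # substitute your provider's endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="darkc0de/XortronCriminalComputingConfig-heretic",
    messages=[{"role": "user", "content": "Summarize the plot of Moby-Dick."}],
    max_tokens=512,
)
print(response.choices[0].message.content)
```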