darkc0de/BlackXorDolphTronGOAT-heretic

Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Feb 3, 2026 · License: WTFPL · Architecture: Transformer · Cold

darkc0de/BlackXorDolphTronGOAT-heretic is a 24 billion parameter decensored language model, derived from darkc0de/BlackXorDolphTronGOAT using the Heretic v1.1.0 tool. This model is an experimental merge of multiple models using the arcee_fusion method, specifically engineered to reduce refusal rates compared to its original counterpart. It features a 32768 token context length and is optimized for use cases requiring less restrictive content generation.


Overview

darkc0de/BlackXorDolphTronGOAT-heretic is a 24 billion parameter language model created by darkc0de. It is a decensored version of the original darkc0de/BlackXorDolphTronGOAT, produced by applying the Heretic v1.1.0 tool. The base model itself is a personal experiment in stacking and shuffling multiple models with mergekit, using the arcee_fusion merge method.
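For readers unfamiliar with mergekit, a merge like this is driven by a small YAML recipe. The sketch below is purely illustrative: the author's actual recipe and constituent models are not published here, so the model names and dtype are placeholders, and only the `merge_method: arcee_fusion` key reflects what the card states.

```yaml
# Illustrative mergekit recipe -- NOT the author's actual config.
# Model identifiers below are hypothetical placeholders.
merge_method: arcee_fusion
base_model: example-org/base-model-24b
models:
  - model: example-org/finetuned-model-24b
dtype: bfloat16
```

A config in this shape would typically be run with mergekit's CLI, which writes the merged weights to an output directory.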

Key Capabilities & Performance

  • Decensored Output: Engineered for less restrictive content generation, with a measurable drop in refusal rate.
  • Reduced Refusals: Refuses 6 of 100 benchmark prompts, down from the original model's 16 of 100.
  • Experimental Merge: Built with mergekit's arcee_fusion merge method, as an exploration of model-merging techniques.
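The refusal numbers above (6/100 vs. 16/100) imply a simple evaluation loop: run a fixed prompt set through the model and count refusals. The exact harness used is not documented on this card; the sketch below shows one common approach, flagging refusals with a phrase heuristic, with `generate` standing in for a real model call.

```python
# Illustrative refusal-rate counter (not the author's actual harness).
from typing import Callable, Iterable

# Common refusal phrases; a real harness may use a classifier instead.
REFUSAL_MARKERS = ("i can't", "i cannot", "i'm sorry", "i am unable", "as an ai")

def is_refusal(text: str) -> bool:
    """Heuristic: does the completion start a refusal-style response?"""
    lowered = text.lower()
    return any(marker in lowered for marker in REFUSAL_MARKERS)

def refusal_rate(prompts: Iterable[str], generate: Callable[[str], str]) -> float:
    """Fraction of prompts whose completion is flagged as a refusal."""
    outputs = [generate(p) for p in prompts]
    refused = sum(is_refusal(o) for o in outputs)
    return refused / len(outputs)

# Toy demo with canned responses instead of a real model:
canned = {
    "q1": "Sure, here is the answer.",
    "q2": "I'm sorry, I can't help with that.",
}
rate = refusal_rate(canned, lambda p: canned[p])  # 1 refusal out of 2 prompts
```

On a 100-prompt set, a score of 6/100 simply means `refusal_rate` returned 0.06 under whatever refusal detector the evaluator used.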

When to Use This Model

  • Less Restrictive Content Generation: Ideal for applications where the original model's refusal rates were a limiting factor.
  • Experimental AI Development: Suitable for researchers and developers interested in exploring the effects of decensoring techniques and advanced model merging strategies.
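If the weights are published on the Hugging Face Hub under the id shown on this card, loading them should follow the standard transformers pattern. This is a hedged sketch, not documented usage: chat-template support, dtype, and hardware requirements (a 24B model needs a large GPU even quantized) are assumptions to adjust for your setup.

```python
# Usage sketch, assuming the weights are on the Hub under this id.
MODEL_ID = "darkc0de/BlackXorDolphTronGOAT-heretic"
MAX_CTX = 32_768  # advertised context length

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the module loads without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer.apply_chat_template(
        [{"role": "user", "content": prompt}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Explain model merging in two sentences."))
```

Keep prompts plus expected output within the 32,768-token context window, and note that the WTFPL license places no restrictions on downstream use.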