coder3101/Cydonia-24B-v4.3-heretic

Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Dec 20, 2025 · Architecture: Transformer

coder3101/Cydonia-24B-v4.3-heretic is a 24-billion-parameter language model derived from TheDrummer/Cydonia-24B-v4.3 and processed with Heretic v1.1.0 to reduce refusals. It is optimized for creative writing, dynamic storytelling, and nuanced roleplay in entertainment-focused applications. It offers a 32,768-token context length and a significantly lower refusal rate than the original model, making it suitable for a wide range of creative tasks.


Model Overview

coder3101/Cydonia-24B-v4.3-heretic is a 24-billion-parameter language model, a 'decensored' version of TheDrummer/Cydonia-24B-v4.3 created with Heretic v1.1.0. The modification reduces the model's refusal rate from 73/100 to 5/100 while keeping the KL divergence from the original model at 0.0447, indicating minimal drift in overall behavior. The model retains the base model's 32,768-token context length.
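
As a rough illustration of basic usage, the model can be queried like any other text-generation model served behind an OpenAI-compatible chat-completions endpoint. The snippet below is a minimal sketch under that assumption; the base URL and API key are placeholders, not values from this model card:

```python
# Minimal sketch: querying the model through an OpenAI-compatible
# chat-completions endpoint. The base URL and API key are placeholders;
# substitute whatever provider is actually hosting the model.
import requests

BASE_URL = "https://api.example.com/v1"   # assumed OpenAI-compatible endpoint
API_KEY = "YOUR_API_KEY"                  # placeholder

payload = {
    "model": "coder3101/Cydonia-24B-v4.3-heretic",
    "messages": [
        {"role": "system", "content": "You are a vivid, consistent storyteller."},
        {"role": "user", "content": "Open a scene in a rain-soaked port city."},
    ],
    "max_tokens": 512,
}

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```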

Key Capabilities

  • Enhanced Roleplay: Offers markedly more dynamic roleplay, with characters that feel distinct and consistent.
  • Creative Writing: Excels at prose and storytelling, often weaving in relevant details that were not explicitly prompted.
  • Reduced Refusals: Engineered to be less restrictive, allowing broader creative exploration without frequent content refusals.
  • Nuance and Adherence: Follows instructions well and handles nuance effectively, even in group chats with multiple distinct characters (a prompt sketch follows this list).
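
One way to exercise the group-chat behavior mentioned above is to define every character once in the system prompt and label each turn with the character being addressed. The structure below is only an illustrative sketch; the character-sheet format and names are assumptions, not a documented template for this model:

```python
# Illustrative group-chat message structure: all characters are defined once
# in the system prompt, and each user turn names the character being addressed.
# The character-sheet format here is an assumption, not an official template.
characters = {
    "Mira": "a wry ship's engineer who hides worry behind jokes",
    "Tomas": "a superstitious first mate who speaks in short sentences",
}

system_prompt = (
    "You are running a multi-character roleplay.\n"
    + "\n".join(f"- {name}: {desc}" for name, desc in characters.items())
    + "\nKeep each character's voice distinct and consistent across turns."
)

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "[To Mira] The engines are making that sound again."},
]
```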

Good For

  • Creative Writing Applications: Ideal for generating stories, prose, and dynamic narratives.
  • Roleplaying Scenarios: Particularly strong for interactive roleplay, character development, and consistent character portrayal.
  • Entertainment-focused AI: Suited for use cases where creative freedom and reduced alignment restrictions are desired.
  • Long-Context Interactions: Remembers details and maintains coherence over extended conversations, even up to roughly 20k tokens of context (a history-trimming sketch follows this list).
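
For long sessions, it helps to keep the running transcript within the 32k-token window. The helper below is a rough sketch that drops the oldest turns using a crude characters-per-token estimate; the 4-characters-per-token heuristic and the reserved output budget are assumptions, not values from this model card:

```python
# Rough sketch: drop the oldest non-system turns so the prompt stays within
# an assumed token budget. Uses a crude ~4 characters-per-token estimate;
# a real implementation would count tokens with the model's tokenizer.
CONTEXT_LIMIT = 32768       # model's advertised context length
RESERVED_FOR_OUTPUT = 1024  # assumption: tokens kept free for the reply

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def trim_history(messages: list[dict]) -> list[dict]:
    budget = CONTEXT_LIMIT - RESERVED_FOR_OUTPUT
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(estimate_tokens(m["content"]) for m in system + rest) > budget:
        rest.pop(0)  # discard the oldest turn first
    return system + rest
```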

Popular Sampler Settings

Featherless users most commonly tune the following sampler parameters for this model: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
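
As an illustration of how those parameters are typically supplied, the request body below includes each of them with placeholder values. The numbers are assumptions chosen as generic creative-writing defaults, not the actual configurations used by Featherless users, and whether a given endpoint accepts top_k, min_p, and repetition_penalty depends on the provider:

```python
# Illustrative sampler settings in an OpenAI-style request body. The values
# are placeholders, NOT the community's actual top configurations, and some
# fields (top_k, min_p, repetition_penalty) are extensions that only certain
# OpenAI-compatible servers accept.
payload = {
    "model": "coder3101/Cydonia-24B-v4.3-heretic",
    "messages": [{"role": "user", "content": "Continue the story."}],
    "temperature": 0.8,          # placeholder value
    "top_p": 0.95,               # placeholder value
    "top_k": 40,                 # provider-specific extension
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,  # provider-specific extension
    "min_p": 0.05,               # provider-specific extension
}
```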