coder3101/Cydonia-24B-v4.3-heretic-v2

24B parameters · FP8 · 32,768-token context

Model Overview

coder3101/Cydonia-24B-v4.3-heretic-v2 is a 24-billion-parameter language model and a decensored variant of TheDrummer/Cydonia-24B-v4.3. It was produced with the Heretic v1.1.0 tool, specifically to reduce alignment-driven refusals and widen creative freedom, and it retains the base model's 32,768-token context length for extended, coherent interactions.
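
The snippet below is a minimal usage sketch, not taken from the model card: it loads the checkpoint with the Hugging Face transformers library and generates a short continuation through the tokenizer's chat template. The dtype, device settings, and sampling parameters are illustrative assumptions; adjust them to your hardware.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "coder3101/Cydonia-24B-v4.3-heretic-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: pick a dtype/quantization that fits your hardware
    device_map="auto",
)

# Build a single-turn prompt with the tokenizer's chat template.
messages = [
    {"role": "user", "content": "Write the opening paragraph of a mystery set on a night train."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sample a continuation; these sampling settings are illustrative, not recommended values.
output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))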

Key Differentiators

This model stands out for its focus on creative, less constrained generation:

  • Decensored Output: Achieves a significantly lower refusal rate (22/100) compared to the original model (73/100), enabling more open-ended and less constrained responses.
  • Enhanced Creativity & Roleplay: Fine-tuned to excel in creative writing, dynamic storytelling, and nuanced roleplay, providing compelling and imaginative narratives.
  • Initiative in Storytelling: Demonstrates the ability to introduce relevant, unmentioned elements into a story, contributing to a more natural and engaging narrative flow.
  • Character Consistency: Maintains consistent character voices and details, even in multi-character group chats and at high context lengths (see the sketch after this list).
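
As an illustration of the multi-turn setup these points refer to, here is a hypothetical sketch (it assumes model and tokenizer are already loaded as in the earlier snippet, and the prompts are invented): a persona-defining system message plus the accumulated chat history are re-rendered through the chat template on every turn, which is how long roleplay transcripts make use of the 32,768-token window.

# Assumes `model` and `tokenizer` from the earlier loading example; prompts are illustrative.
history = [
    {"role": "system", "content": "Narrate a scene with two characters, Mira and Dex. Keep their voices distinct."},
    {"role": "user", "content": "Mira enters the workshop and finds Dex asleep at the bench."},
]

def reply(history, max_new_tokens=300):
    # Render the whole running conversation with the model's chat template.
    input_ids = tokenizer.apply_chat_template(
        history, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.8)
    text = tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
    # Keep the assistant turn in the history so later turns retain the full context.
    history.append({"role": "assistant", "content": text})
    return text

print(reply(history))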

Ideal Use Cases

This model is particularly well-suited for scenarios requiring:

  • Creative Writing & Story Generation: For users seeking a model that can generate imaginative prose, plot twists, and engaging narratives.
  • Roleplay & Interactive Fiction: Excels in maintaining character consistency and dynamic interactions within roleplaying scenarios.
  • Unrestricted Content Generation: When the goal is to explore themes and topics without the typical alignment constraints found in many foundational models.
  • Entertainment Applications: Designed to provide an enjoyable and flexible experience for various entertainment-focused AI interactions.