coder3101/Magidonia-24B-v4.3-heretic

Public · 24B parameters · FP8 · 32768-token context

Model Overview

coder3101/Magidonia-24B-v4.3-heretic is a 24-billion-parameter language model derived from TheDrummer's Magidonia-24B-v4.3 and modified for decensored output with the Heretic v1.1.0 tool. It prioritizes creativity, usability, and entertainment, distinguishing itself from models optimized for strict alignment or problem-solving.

Key Differentiators & Capabilities

  • Decensored Output: Achieves a refusal rate of 5/100, down from the original model's 73/100, enabling less constrained and more diverse generations.
  • Enhanced Creativity: Focuses on writing quality, dynamism in storytelling, and imaginative responses, aiming for a "writer-like" feel.
  • Flexible Alignment: Designed for use cases that do not require strict alignment, offering a broader range of expression.
  • Instruction Adherence: Capable of following instructions and understanding nuances in prompts, including specific thinking prefill formats like <thinking> </thinking>.
  • Roleplay Optimization: Users report much better roleplay capabilities and strong adherence to character cards.
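One common way to apply the thinking prefill mentioned above is to pre-seed the final assistant turn so generation continues inside the <thinking> block rather than starting a fresh reply. A minimal sketch (the message layout, system text, and helper name are illustrative, not part of this model card):

```python
# Sketch: seed the assistant turn with a thinking prefill.
# Most chat templates/APIs treat a trailing assistant message as a prefill,
# so the model continues from that text instead of opening a new turn.

def build_prefilled_turn(user_msg: str, prefill: str = "<thinking>") -> list[dict]:
    """Return a chat message list whose final assistant turn is pre-seeded
    so the model's generation continues inside the thinking block."""
    return [
        {"role": "system", "content": "You are a creative writing assistant."},
        {"role": "user", "content": user_msg},
        # Trailing assistant message = prefill; the model completes it.
        {"role": "assistant", "content": prefill},
    ]

messages = build_prefilled_turn("Open the story with a storm at sea.")
```

With recent transformers versions, the same effect is typically achieved by passing these messages to `tokenizer.apply_chat_template(..., continue_final_message=True)`, which keeps the final assistant turn open so decoding resumes inside the prefill.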

Ideal Use Cases

  • Creative Writing: Generating stories, scripts, or other imaginative content.
  • Roleplay (RP): Engaging in dynamic and character-consistent roleplaying scenarios.
  • Choose Your Own Adventure (CYOA): Creating interactive narrative experiences.
  • Dungeon/Chat/Companion: Applications requiring versatile and engaging conversational AI.
  • Authoring Tools: Assisting with content generation where creative freedom is paramount.