MuXodious/Hearthfire-24B-absolute-heresy

Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Jan 18, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

MuXodious/Hearthfire-24B-absolute-heresy is a 24-billion-parameter fine-tune of Hearthfire-24B, developed by P-E-W and Gryphe Padar, with a 32,768-token context length. It is designed for longform narrative writing, prioritizing atmosphere, introspection, and slow-burn scenes over rapid plot progression or high-stakes action. The model excels at generating immersive, cooperative narratives, making it well suited to creative writing applications that call for detailed environmental description and rendered character interiority.


Hearthfire-24B-absolute-heresy Overview

MuXodious/Hearthfire-24B-absolute-heresy is a 24-billion-parameter model, fine-tuned from Hearthfire-24B using P-E-W's Heretic v1.1.0 engine. This iteration earns the "Absolute Heresy" index, reflecting a low refusal rate (8/100) and low KL divergence from the base model (0.0670); the label itself is arbitrary and not an indicator of performance. The model is specifically designed for longform narrative writing, emphasizing atmosphere, introspection, and a deliberate pace.

Key Capabilities

  • Atmospheric Narrative Generation: Prioritizes "vibes over velocity," comfortable with silence and detailed scene-setting.
  • Introspection and Slow Burn: Excels at expanding on current states and character internal thoughts rather than forcing immediate dramatic consequences.
  • Cooperative Storytelling: Retains a warm, cooperative tone, inclined to build narratives collaboratively rather than being hostile or punishing.
  • High Agency: Designed to write in the user's stead, acting and speaking for characters to maintain narrative flow, preventing descriptive stagnation.
  • Long Context Writing: Trained on thousands of 8-16K context writing examples, supporting extended narrative blocks.

Good For

  • Creative Writing Applications: Ideal for generating immersive stories, roleplay, and interactive fiction where detailed descriptions and character depth are paramount.
  • Atmospheric Storytelling: Users seeking narratives that focus on mood, setting, and character development rather than constant action.
  • Collaborative Writing: Scenarios where the model is expected to actively contribute to the narrative, including character actions and dialogue.
  • Second-Person Present Tense Narratives: The model was specifically trained with a 'continue-heavy' structure using this tense, making it highly proficient.
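Since the model was trained on a ChatML, second-person present-tense structure, prompts benefit from matching that format. The sketch below assembles such a prompt; the helper name, system text, and scene are illustrative assumptions, not from the model card.

```python
# Minimal sketch of a ChatML prompt in the second-person present tense
# the model was trained on. Everything but the ChatML delimiters
# (<|im_start|>, <|im_end|>) is an illustrative assumption.

def build_chatml_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a ChatML prompt string from (role, content) turns."""
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, content in turns:
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>")
    # Leave the assistant turn open so the model continues the narrative.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt(
    "You are a narrator for a slow-burn, atmospheric story told in the "
    "second person, present tense.",
    [("user", "You step into the firelit hall, snow still melting on your cloak.")],
)
print(prompt)
```

Keeping the final assistant turn open (no closing `<|im_end|>`) is what invites the model to continue the scene rather than start a new one.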

Note on Tokenizers: Users should add both </s> (Mistral; Token ID 2) and <|im_end|> (ChatML; Token ID 999) EOS tokens to stop sequences for optimal performance. The model was trained using the ChatML prompt format.
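A request that follows the note above might register both EOS tokens as stop sequences. This is a minimal sketch of a payload for an OpenAI-compatible completions endpoint (e.g. a vLLM server); the prompt text, `max_tokens` value, and server choice are assumptions, while the stop strings come from the note.

```python
# Sketch of a completion request payload that stops on both the Mistral
# EOS (</s>, token ID 2) and the ChatML EOS (<|im_end|>, token ID 999),
# as the tokenizer note recommends. Endpoint details are assumptions.

payload = {
    "model": "MuXodious/Hearthfire-24B-absolute-heresy",
    "prompt": (
        "<|im_start|>user\n"
        "You wake to the smell of woodsmoke.<|im_end|>\n"
        "<|im_start|>assistant\n"
    ),
    "max_tokens": 512,
    # Register both EOS tokens so generation halts cleanly regardless of
    # which one the model emits.
    "stop": ["</s>", "<|im_end|>"],
}
print(payload["stop"])
```

With an OpenAI-compatible server this dict could be POSTed as JSON to the `/v1/completions` route; without both stop strings, the model may run past the end of its turn.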