USS-Inferprise/Dark-Cydonian-Wind-24B

Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32K · Published: Apr 17, 2026 · License: MIT · Architecture: Transformer · Open Weights

Dark Cydonian Wind 24B, developed by USS-Inferprise, is a 24 billion parameter language model created through a TIES merge of three Mistral-Small derivatives: Cydonia-24B-v4.3, The-Omega-Darker-The-Final-Directive-24B, and Redemption Wind. The model is tuned for creative writing, balancing narrative quality with a distinct 'spiciness' for crafting complex characters such as villains, while staying stable and reducing the 'slop' present in its parent models. It supports context windows of 48K-64K tokens and is designed to fit on 24GB VRAM cards, or on 16GB VRAM cards with a reduced context window.

Dark Cydonian Wind 24B Overview

Dark Cydonian Wind 24B is a 24 billion parameter language model developed by USS-Inferprise, created using a TIES merge of three distinct Mistral-Small derivatives: Cydonia-24B-v4.3, The-Omega-Darker-The-Final-Directive-24B, and Redemption Wind. This unique merging strategy balances the narrative quality of Cydonia and Redemption Wind with the 'spiciness' introduced by Omega Darker, resulting in a model well-suited for nuanced creative writing.
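The card does not publish the exact merge recipe, but the TIES procedure it names is well documented: trim each fine-tune's delta from the shared base model, elect a per-parameter sign by majority, and average only the deltas that agree with that sign. Below is a minimal PyTorch sketch of that operation for a single parameter tensor; the density value and function name are illustrative, not the actual configuration used for this model.

```python
# Minimal sketch of TIES merging (trim / elect sign / disjoint merge) for one
# parameter tensor. Real merges of this kind are usually driven by tooling
# such as mergekit; the density here is illustrative only.
import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
               density: float = 0.5) -> torch.Tensor:
    # Task vectors: how each fine-tune differs from the shared base weights.
    deltas = [ft - base for ft in finetuned]

    # Trim: keep only the largest-magnitude `density` fraction of each delta.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.numel()))
        threshold = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= threshold, d, torch.zeros_like(d)))

    # Elect sign: per-parameter sign of the summed trimmed deltas.
    stacked = torch.stack(trimmed)
    elected_sign = torch.sign(stacked.sum(dim=0))

    # Disjoint merge: average only the deltas that agree with the elected sign.
    agree = (torch.sign(stacked) == elected_sign) & (stacked != 0)
    merged_delta = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)

    return base + merged_delta
```

In practice a merge like this is expressed as a merge-tool config rather than hand-written code; the sketch only illustrates the arithmetic behind the "TIES merge" named above.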

Key Capabilities

  • Optimized for Creative Writing: Excels at generating engaging narratives, particularly for crafting complex characters with a distinct edge, such as villains, without veering into overly explicit content.
  • Large Context Support: Designed to handle extensive inputs, supporting context windows ranging from 48K to 64K tokens.
  • Efficient Resource Utilization: The 24B architecture runs effectively on 24GB VRAM graphics cards and can also operate on 16GB VRAM cards with a reduced context window; see the loading sketch after this list.
  • Reduced 'Slop': The TIES merge process has significantly mitigated the 'sloppiness' observed in some of its parent models, leading to more coherent and stable outputs.
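
As a concrete illustration of the VRAM point above, here is a minimal loading sketch using Hugging Face transformers with bitsandbytes 4-bit quantization. The repo id is assumed to match this page's title, and the memory figures are rough estimates: a 24B model in 4-bit weights lands around 13-14 GB, which is what makes a reduced-context setup on a 16 GB card plausible, while 24 GB cards have headroom for longer contexts or higher-precision quants.

```python
# Minimal local-loading sketch with transformers + bitsandbytes 4-bit
# quantization. The repo id below is an assumption based on this page's title.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "USS-Inferprise/Dark-Cydonian-Wind-24B"  # assumed repo id

bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb,   # 4-bit weights to fit smaller VRAM budgets
    device_map="auto",
)

prompt = "Sketch the inner monologue of a charming, unrepentant villain."
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=300, temperature=1.0, do_sample=True)
# Decode only the newly generated tokens, not the prompt.
print(tok.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```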

Good For

  • Creative Writing & Roleplay: Ideal for authors and developers looking to generate rich, character-driven narratives, especially those involving morally ambiguous or villainous characters (a request sketch follows this list).
  • Applications Requiring Nuanced Tone: Suitable for scenarios where a balance between engaging storytelling and a subtle 'edge' is desired, avoiding overly sanitized or excessively depraved outputs.
  • Long-Context Tasks: Benefits from its large context window, making it well suited for applications that must retain and process extensive information over long interactions.
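
For the creative-writing and roleplay use cases above, a typical integration goes through an OpenAI-compatible chat endpoint. The sketch below uses the openai Python client; the base URL, API-key environment variable, and model id are placeholders for whichever provider actually hosts this checkpoint.

```python
# Sketch of a roleplay-style request through an OpenAI-compatible endpoint.
# The base_url, API key variable, and model id are assumptions; substitute
# whatever provider actually serves this checkpoint.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://example-inference-host/v1",  # hypothetical endpoint
    api_key=os.environ["INFERENCE_API_KEY"],       # hypothetical env var
)

resp = client.chat.completions.create(
    model="USS-Inferprise/Dark-Cydonian-Wind-24B",  # assumed model id
    messages=[
        {"role": "system",
         "content": "You are a co-writer for dark fantasy fiction. Keep the "
                    "antagonist menacing but never gratuitous."},
        {"role": "user",
         "content": "Write the villain's soliloquy as the city burns below."},
    ],
    temperature=1.0,
    max_tokens=600,
)
print(resp.choices[0].message.content)
```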