Vortex5/Cosmic-Night-12B

Text generation · 12B parameters · FP8 quantization · 32k context length · Published: Apr 12, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

Vortex5/Cosmic-Night-12B is a 12 billion parameter language model with a 32,768-token context length, created by merging seven distinct 12B models using a custom method. It is optimized for creative writing tasks, excelling at roleplay, long-form storytelling, and atmospheric fiction. Its merge configuration aims to enhance emotional interaction and narrative depth.


Cosmic-Night-12B: A Merged Model for Creative Writing

Cosmic-Night-12B is a 12 billion parameter language model developed by Vortex5, featuring a substantial 32,768-token context length. It was constructed through a custom merging process combining seven different 12B models: NoctyxCosma-12B, Rocinante-X-12B-v1, magnum-v4-12b, MN-Slush, Darklit-Maiden-12B, Nemo-12b-Humanize-SFT-v0.2.5-KTO, and MN-12b-RP-Ink-RP-Longform. The merge used a custom 'gema' method, configured with parameters for strength, aggression, consensus amplification, novelty, and osmosis strength.
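The 'gema' method itself is custom and not publicly documented, so its parameters cannot be reproduced here. For orientation, the sketch below shows the simplest form of weight-space merging (uniform linear averaging of checkpoints), which merge tools generalize with per-model weights and method-specific heuristics. The function and its name are illustrative only, not the author's method.

```python
def linear_merge(state_dicts, weights=None):
    """Merge model checkpoints by averaging parameters key-by-key.

    `state_dicts` is a list of mappings from parameter name to tensor
    (or any value supporting `*` and `+`); all checkpoints must share
    identical keys and shapes. With no `weights`, this is a uniform
    average; per-model weights give a weighted linear merge.
    """
    n = len(state_dicts)
    if weights is None:
        weights = [1.0 / n] * n
    merged = {}
    for key in state_dicts[0]:
        # Weighted sum of the same parameter across all checkpoints.
        merged[key] = sum(w * sd[key] for w, sd in zip(weights, state_dicts))
    return merged
```

Real merges of 12B models operate on PyTorch state dicts and typically use more elaborate schemes (task vectors, TIES, SLERP) rather than plain averaging, but the key-by-key structure is the same.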

Key Capabilities

  • Roleplay: Designed for emotion-forward interactions, making it suitable for dynamic character engagement.
  • Storytelling: Optimized for generating long-form narratives with coherent plot development.
  • Creative Writing: Excels at producing atmospheric and immersive fiction.

Intended Use Cases

This model is particularly well-suited for applications requiring advanced creative text generation, including:

  • Developing interactive role-playing scenarios.
  • Assisting authors with long-form story creation and plot generation.
  • Generating descriptive and evocative fictional content.
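As a usage sketch, the model can be loaded with the Hugging Face transformers library for story generation. The prompt wrapper and sampling values below are illustrative assumptions, not settings published by Vortex5; if the repository ships a chat template, `tokenizer.apply_chat_template` should be preferred.

```python
# Hedged sketch: generating long-form fiction with Cosmic-Night-12B via
# Hugging Face transformers. Prompt format and sampling values are
# illustrative assumptions, not settings published by the model author.

MODEL_ID = "Vortex5/Cosmic-Night-12B"

def build_prompt(premise: str) -> str:
    """Wrap a story premise in a plain instruction. If the model ships a
    chat template, prefer tokenizer.apply_chat_template instead."""
    return (
        "Write an atmospheric short story based on this premise:\n"
        f"{premise}\n\nStory:"
    )

def generate_story(premise: str, max_new_tokens: int = 512) -> str:
    # Imported lazily so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(premise), return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,      # sampling suits creative writing better than greedy decoding
        temperature=0.8,     # illustrative value
        top_p=0.95,          # illustrative value
    )
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Running `generate_story("A lighthouse keeper notices the fog never lifts")` downloads the FP8 checkpoint and produces a continuation; hardware with enough memory for a 12B model is required.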