Vortex5/Wicked-Nebula-12B

Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32k · Published: Mar 20, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Vortex5/Wicked-Nebula-12B is a 12-billion-parameter language model created by Vortex5 through a custom merge of five distinct 12B models, including Hollow-Aether-12B and Rocinante-X-12B-v1. With a 32,768-token (32k) context window, it is optimized for creative writing tasks such as structured long-form storytelling, emotion-forward roleplay, and atmospheric fiction. The merge aims to combine the narrative coherence and expressive strengths of its constituent models.
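The FP8 quantization listed above matters mainly for memory: 8-bit weights take roughly half the VRAM of 16-bit weights. A back-of-envelope sketch (the function name is illustrative, and the estimate covers weights only, ignoring KV cache, activations, and runtime overhead):

```python
def estimate_weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Rough memory needed just to hold the weights, in GiB.

    Excludes KV cache, activations, and framework overhead, so treat
    the result as a lower bound on real VRAM usage.
    """
    return n_params * bytes_per_param / 1024**3


N_PARAMS = 12e9  # 12B parameters

fp16_gb = estimate_weight_memory_gb(N_PARAMS, 2)  # 16-bit weights
fp8_gb = estimate_weight_memory_gb(N_PARAMS, 1)   # 8-bit (FP8) weights

print(f"FP16 weights: ~{fp16_gb:.1f} GiB, FP8 weights: ~{fp8_gb:.1f} GiB")
```

By this estimate the FP8 weights fit in roughly 11–12 GiB, which is what makes a 12B model practical on a single consumer GPU.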


Wicked-Nebula-12B Overview

Wicked-Nebula-12B is a 12-billion-parameter language model developed by Vortex5. It was created using a custom smcos merge method, combining five specialized 12B models: Hollow-Aether-12B, MN-12b-RP-Ink-RP-Longform, Lunar-Twilight-12B, Astral-Noctra-12B, and Rocinante-X-12B-v1. The merge is intended to combine the strengths of these constituent models for long-form storytelling and roleplay.
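The smcos method itself is not documented here, but model merges of this kind typically interpolate the constituent models' weight tensors. A minimal sketch of one common merge primitive, spherical linear interpolation (SLERP), applied to two toy "weight" vectors; this illustrates the general idea only and is not the actual smcos implementation:

```python
import numpy as np


def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors.

    Blends along the arc between a and b rather than the straight line,
    which tends to preserve weight magnitudes better than plain averaging.
    """
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b


# Toy example: blend two orthogonal unit "weight" vectors halfway.
w1 = np.array([1.0, 0.0])
w2 = np.array([0.0, 1.0])
merged = slerp(0.5, w1, w2)  # lands on the arc midway between w1 and w2
```

A real merge would apply an operation like this (or a more elaborate, similarity-weighted variant) tensor by tensor across all five donor checkpoints.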

Key Capabilities

  • Storytelling: Designed for generating structured, long-form narratives.
  • Roleplay: Optimized for emotion-forward and interactive roleplaying scenarios.
  • Creative Writing: Excels at producing atmospheric and engaging fiction.

Good for

  • Developers and writers focused on generating detailed stories and plots.
  • Applications requiring nuanced character interactions and emotional depth in text.
  • Projects that benefit from rich, descriptive, and imaginative content generation.