Vortex5/NoctyxCosma-12B

Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Dec 15, 2025 · Architecture: Transformer

NoctyxCosma-12B is a 12 billion parameter language model developed by Vortex5, created by merging six distinct 12B models using a custom smi_oni merge method. This model is specifically designed and optimized for creative writing, storytelling, and roleplay applications. Its unique merging approach aims to combine the strengths of its constituent models to excel in generative text tasks requiring imagination and narrative coherence.


NoctyxCosma-12B: A Merged Model for Creative Generation

NoctyxCosma-12B was built with a custom smi_oni merge method that combines six individual 12B models: Harmony-Bird-12B, Chrysologus-12B, Rei-V3-KTO-12B, Scarlet-Eclipse-12B, Lunar-Nexus-12B, and Red-Synthesis-12B. This merging strategy aims to draw on the diverse capabilities of its constituents.

Key Capabilities

  • Advanced Model Merging: Built with a specialized `smi_oni` merge method whose tuned parameters (`k_core`, `strength_core`, `consensus_core`, `drop_cos`, `drop_min`, `strength_nov`, `novelty_budget`, `consensus_nov`, and `conflict_bonus`) shape how the source models are combined into a synergistic whole.
  • Bfloat16 Precision: The model's weights are stored in bfloat16 for efficient computation.
  • Shared Tokenizer: Uses the tokenizer from Vortex5/Red-Synthesis-12B for consistent text processing.
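As a concrete illustration of the precision and tokenizer details above, the model might be loaded with standard Hugging Face `transformers` tooling. This is a minimal sketch: the helper names and sampling values are illustrative assumptions, not official guidance from the model card.

```python
def creative_sampling_settings() -> dict:
    """Illustrative sampling settings for creative generation (assumed, not official)."""
    return {
        "max_new_tokens": 512,
        "do_sample": True,
        "temperature": 0.8,   # looser sampling tends to suit storytelling
        "top_p": 0.95,
    }


def load_noctyxcosma(model_id: str = "Vortex5/NoctyxCosma-12B"):
    """Load the model in bfloat16, as the card describes.

    Imports live inside the function so the sketch stays importable
    even without torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # The merged repo's tokenizer originates from Vortex5/Red-Synthesis-12B.
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,   # matches the card's stated precision
        device_map="auto",
    )
    return tokenizer, model
```

Calling `load_noctyxcosma()` downloads roughly 24 GB of bfloat16 weights (12B parameters at 2 bytes each), so run it on suitable hardware; the settings dict can be passed straight through as `model.generate(**inputs, **creative_sampling_settings())`.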

Intended Use Cases

NoctyxCosma-12B is optimized for generative tasks that require imaginative, coherent text output. Its primary applications include:

  • Creative Writing: Generating original stories, poems, or other literary content.
  • Storytelling: Assisting in the development of narratives, plotlines, and character arcs.
  • Roleplay: Facilitating interactive and dynamic role-playing scenarios with nuanced responses.
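To make the roleplay use case concrete, a turn can be framed as chat-style messages. The persona, message schema, and fallback formatting below are illustrative assumptions; the card does not document a prompt format.

```python
def build_roleplay_messages(persona: str, user_turn: str) -> list:
    """Assemble a system persona plus the user's latest turn (hypothetical schema)."""
    return [
        {"role": "system",
         "content": f"You are roleplaying as {persona}. Stay in character."},
        {"role": "user", "content": user_turn},
    ]


def flat_prompt(messages: list) -> str:
    """Plain-text fallback for models without a chat template."""
    lines = [f"{m['role']}: {m['content']}" for m in messages]
    return "\n".join(lines) + "\nassistant:"


messages = build_roleplay_messages(
    "a weary starship navigator", "What do you see out the viewport?"
)
print(flat_prompt(messages))
```

If the tokenizer ships a chat template, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` would replace `flat_prompt`; otherwise the fallback string can be tokenized and passed to `model.generate` directly.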