Vortex5/Mystic-Matron-12B

Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32k · Published: Dec 30, 2025 · Architecture: Transformer

Mystic-Matron-12B is a 12-billion-parameter language model created by Vortex5 by merging Scarlet-Seraph-12B, Mahou-1.5-Mistral-Nemo-12B, and Chaos-Unknown-12B. It is designed and optimized for creative applications such as storytelling, roleplay, and general creative writing. The merge uses a custom method intended to strengthen its generative capabilities in these domains, and the model offers a 32,768-token (32k) context length.


Mystic-Matron-12B Overview

Mystic-Matron-12B was produced through a custom merging process that combines three distinct models: Scarlet-Seraph-12B, Mahou-1.5-Mistral-Nemo-12B, and Chaos-Unknown-12B. The fusion aims to draw on the strengths of each constituent model so the result excels at its target generative tasks.

Key Capabilities

  • Specialized Merging: Uses a custom `cdrf` merge method with the parameters strength: 0.92, route: 0.42, tau: 3.5, and agree: 0.52 to achieve its distinct characteristics.
  • Creative Generation: Optimized for generating engaging and coherent content in creative writing contexts.
  • Extended Context: Supports a 32,768-token context length, allowing longer prompts and more extensive, detailed creative outputs.
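The 32k context window carries a real memory cost at inference time. As a rough back-of-envelope sketch, assuming the merged model keeps the Mistral-Nemo architecture of two of its parents (40 layers, 8 grouped-query KV heads, head dimension 128; these figures are assumptions, not stated on this card), an FP8 KV cache for the full window comes to about 2.5 GiB:

```python
# Back-of-envelope KV-cache size for a full 32k-token context.
# Architecture numbers are ASSUMED from Mistral-Nemo-12B (the base of
# two of the merged models); they are not stated on this card.
LAYERS = 40        # transformer layers (assumption)
KV_HEADS = 8       # grouped-query KV heads (assumption)
HEAD_DIM = 128     # per-head dimension (assumption)
CTX = 32_768       # context length (from the card)
BYTES = 1          # FP8 = 1 byte per element (from the card's quant)

per_token = 2 * LAYERS * KV_HEADS * HEAD_DIM * BYTES  # factor 2: K and V
total = per_token * CTX
print(per_token)        # bytes of cache per token (~80 KiB)
print(total / 2**30)    # GiB for the full 32k window
```

Under these assumptions each cached token costs 81,920 bytes, so filling the window costs 2.5 GiB on top of the model weights; halving the context halves this figure.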

Good For

  • Storytelling: Generating narratives, plotlines, and character dialogues.
  • Roleplay: Creating dynamic and immersive roleplaying scenarios and responses.
  • Creative Writing: Assisting with various forms of creative text generation, from poetry to descriptive passages.
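For the roleplay and storytelling uses above, prompt formatting matters. Merges built on Mistral-Nemo derivatives often ship a ChatML-style chat template, but the template actually bundled with Mystic-Matron-12B should be confirmed from its tokenizer configuration; the helper below is only a hedged sketch of that ChatML convention, and the function name and example prompt are illustrative, not part of the model card.

```python
def format_chatml(system: str, turns: list[tuple[str, str]]) -> str:
    """Build a ChatML-style prompt string.

    Assumption: the merged model inherits a ChatML template from its
    Mistral-Nemo-based parents; check the repo's tokenizer_config.json
    (chat_template field) to confirm before relying on this format.
    """
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, text in turns:
        parts.append(f"<|im_start|>{role}\n{text}<|im_end|>")
    # Trailing open assistant turn tells the model to start generating.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = format_chatml(
    "You are a vivid storyteller.",
    [("user", "Write the opening line of a gothic tale.")],
)
print(prompt)
```

In practice, prefer `tokenizer.apply_chat_template(...)` from the `transformers` library once the weights are downloaded, since it uses whatever template the repo actually ships.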