Vortex5/Crimson-Twilight-12B

Text Generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Oct 25, 2025 · Architecture: Transformer

Crimson-Twilight-12B is a 12-billion-parameter merged language model developed by Vortex5, designed and optimized for narrative roleplay. Built with MergeKit, it combines several base models through a multi-stage sequence of nearswap, SLERP, and Karcher-mean merges to enhance its roleplaying capabilities. A 32,768-token context length makes it suitable for extended conversational and story-generation tasks.


Overview

Crimson-Twilight-12B is a 12-billion-parameter language model created by Vortex5 and engineered for narrative roleplay. It is the product of a multi-stage merging process built with MergeKit, which combines several specialized base models to achieve its targeted performance.

Merge Process

The model's capabilities stem from a multi-stage merge pipeline:

  • Step 1: Abyssal-Seraph-12B is merged with Lunar-Abyss-12B using the nearswap method.
  • Step 2: Moonlit-Shadow-12B is merged with Luminous-Shadow-12B via the slerp method.
  • Final Merge: The two intermediate models from the previous steps are then combined using the Karcher mean method, resulting in the final Crimson-Twilight-12B model.
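
MergeKit implements these merge methods internally; as a rough illustration of the underlying math, the sketch below shows SLERP (spherical linear interpolation) between two flattened weight tensors and an iterative Karcher (Fréchet) mean on unit-normalized vectors. The function names, the `t=0.5` ratio, and the flattened-tensor treatment are assumptions for illustration only, not MergeKit's actual implementation; the nearswap step is omitted.

```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float = 0.5, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened tensors (illustrative)."""
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)              # angle between the two directions
    if theta < eps:                     # nearly parallel: fall back to linear blend
        return (1 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b

def karcher_mean(points, iters: int = 20, eps: float = 1e-8) -> np.ndarray:
    """Karcher mean of unit vectors on the sphere via tangent-space averaging (illustrative)."""
    ps = [p / (np.linalg.norm(p) + eps) for p in points]
    mu = ps[0].copy()
    for _ in range(iters):
        # Log-map each point into the tangent space at mu, then average.
        tangents = []
        for p in ps:
            dot = np.clip(np.dot(mu, p), -1.0, 1.0)
            theta = np.arccos(dot)
            if theta < eps:
                tangents.append(np.zeros_like(mu))
            else:
                v = p - dot * mu
                tangents.append(theta * v / (np.linalg.norm(v) + eps))
        step = np.mean(tangents, axis=0)
        norm = np.linalg.norm(step)
        if norm < eps:                  # converged: mean tangent is (near) zero
            break
        # Exp-map: move mu along the averaged tangent direction.
        mu = np.cos(norm) * mu + np.sin(norm) * step / norm
    return mu
```

For two points, the Karcher mean reduces to the SLERP midpoint, which is why the final merge step can be seen as a generalization of Step 2's interpolation to more than two models.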

Key Characteristics

  • Parameter Count: 12 billion.
  • Context Length: 32,768 tokens.
  • Primary Use Case: Optimized for generating engaging, coherent narrative roleplay scenarios and responses.

When to Use This Model

This model is particularly well-suited for applications requiring:

  • Immersive Roleplaying: Generating detailed character interactions and story progression.
  • Creative Writing: Assisting with long-form narrative generation and descriptive text.
  • Extended Conversations: Leveraging its substantial context length for maintaining coherence over lengthy dialogues.