Vortex5/MS3.2-24B-Penumbra-Aether

Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Jan 3, 2026 · Architecture: Transformer

Vortex5/MS3.2-24B-Penumbra-Aether is a 24 billion parameter language model created by Vortex5 through a custom merge of MS3.2-24B-Chaos-Skies, Cydonia-24B-v4.3, Hearthfire-24B, and ms3.2-24b-longform. This model is specifically optimized for creative text generation, excelling in storytelling, roleplay, and general creative writing tasks. It features a context length of 32768 tokens, making it suitable for extended narrative generation.
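As a rough illustration (not part of the model card itself), the weight-memory footprint implied by the listed 24B parameter count and FP8 quantization can be estimated with a back-of-the-envelope calculation; KV-cache and activation overhead are excluded:

```python
# Back-of-the-envelope weight-memory estimate for a 24B-parameter
# model stored in FP8 (1 byte per parameter). Illustrative only;
# runtime overhead (KV cache, activations) is not included.

def weight_memory_gib(num_params: int, bytes_per_param: float) -> float:
    """Return the approximate weight memory in GiB."""
    return num_params * bytes_per_param / (1024 ** 3)

params = 24_000_000_000                      # "24B" parameters
fp8_gib = weight_memory_gib(params, 1.0)     # FP8: 1 byte/param
bf16_gib = weight_memory_gib(params, 2.0)    # bfloat16: 2 bytes/param

print(f"FP8 weights:  ~{fp8_gib:.1f} GiB")   # ~22.4 GiB
print(f"BF16 weights: ~{bf16_gib:.1f} GiB")  # ~44.7 GiB
```

Halving bytes per parameter (BF16 to FP8) roughly halves the memory needed just to hold the weights, which is the main reason quantized variants are served.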


Overview

MS3.2-24B-Penumbra-Aether is a 24 billion parameter language model developed by Vortex5. It was created with a custom merging method that combines several base models: MS3.2-24B-Chaos-Skies, Cydonia-24B-v4.3, Hearthfire-24B, and ms3.2-24b-longform. The merge configuration sets parameters for strength, flavor, paradox, cube dimensions, steps, and boost, and the merge was computed in the bfloat16 data type.
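The hpq method and its parameters are custom to Vortex5 and not publicly documented, so they cannot be reproduced here. As a generic illustration of the underlying idea of weight-space merging, a simple weighted average over the parent models' parameters can be sketched as follows (all names and weights below are hypothetical):

```python
# Generic sketch of weight-space model merging: each merged parameter
# is a weighted average of the corresponding parameter values from
# the parent models. The hpq method used for Penumbra-Aether is a
# custom, undocumented variant; this only shows the basic principle.

def merge_state_dicts(models, weights):
    """Average per-parameter values across models with given weights.

    models:  list of {param_name: [float, ...]} state dicts
    weights: one merge weight per model (normalized internally)
    """
    total = sum(weights)
    norm = [w / total for w in weights]
    merged = {}
    for name in models[0]:
        values = [m[name] for m in models]
        merged[name] = [
            sum(w * v[i] for w, v in zip(norm, values))
            for i in range(len(values[0]))
        ]
    return merged

# Two toy "models", each with a single 3-element weight tensor.
a = {"layer.weight": [1.0, 2.0, 3.0]}
b = {"layer.weight": [3.0, 2.0, 1.0]}
print(merge_state_dicts([a, b], [0.5, 0.5]))
# Equal weights -> element-wise mean: [2.0, 2.0, 2.0]
```

Real merge tooling applies this kind of combination tensor by tensor across full checkpoints, often with per-layer weights or more elaborate interpolation schemes.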

Key Capabilities

  • Model Merging: Built from multiple specialized models to combine their strengths.
  • Custom Merge Method: Uses a custom hpq merge method with fine-tuned parameters.
  • Extended Context: Supports a context length of 32768 tokens, beneficial for longer interactions and narratives.
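To give a sense of what a 32768-token window holds, a rough fit check can be done with the common ~4 characters-per-token heuristic for English prose (the actual count depends on this model's tokenizer):

```python
# Rough check of whether a text fits the 32768-token context window,
# using the common ~4 characters-per-token heuristic for English.
# Actual token counts depend on the model's tokenizer.

CONTEXT_LENGTH = 32768
CHARS_PER_TOKEN = 4  # heuristic, not exact

def estimated_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(text: str, reserved_for_reply: int = 1024) -> bool:
    """True if the prompt plus a reply budget fits the window."""
    return estimated_tokens(text) + reserved_for_reply <= CONTEXT_LENGTH

chapter = "word " * 20000  # ~100k characters of draft prose
print(estimated_tokens(chapter), fits_in_context(chapter))
# -> 25000 True (about 25k estimated tokens, within the 32k window)
```

In practice the precise count should come from the model's own tokenizer, but a heuristic like this is enough to decide whether a long draft needs trimming before it is sent.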

Good For

  • Storytelling: Designed to generate coherent and engaging narratives.
  • Roleplay: Optimized for interactive and character-driven conversational scenarios.
  • Creative Writing: Excels in various forms of creative text generation, including prose and imaginative content.