Vortex5/Astral-Noctra-12B
Astral-Noctra-12B by Vortex5 is a 12-billion-parameter language model with a 32,768-token context length, created by merging Hollow-Aether-12B, KiloNovaSynth-12B, Violet-Lyra-Gutenberg-v2, and Tlacuilo-12B. The model is optimized for creative applications, excelling at storytelling, roleplay, and imaginative writing. It uses a custom smi_oni merge method to combine the strengths of its constituent models for stronger narrative generation.
Overview
Vortex5's Astral-Noctra-12B is a 12-billion-parameter language model designed for creative text generation, featuring a 32,768-token context length. It was built by merging four distinct models: Hollow-Aether-12B, KiloNovaSynth-12B, Violet-Lyra-Gutenberg-v2, and Tlacuilo-12B. The merge was performed with a custom smi_oni method that combines the strengths of the base models into a specialized tool for imaginative tasks.
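The smi_oni method is Vortex5's own, and the exact recipe is not published here. As a rough illustration of how a four-model merge of this kind is typically declared, a mergekit-style configuration might look like the sketch below; the `base_model` choice, repository paths, and `dtype` are assumptions, and `smi_oni` is not a stock mergekit method name:

```yaml
# Illustrative sketch only: smi_oni is a custom method, and the actual
# recipe, base model, and settings used for Astral-Noctra-12B are unknown.
merge_method: smi_oni              # custom method; not built into stock mergekit
base_model: Hollow-Aether-12B      # placeholder choice of base
models:
  - model: Hollow-Aether-12B
  - model: KiloNovaSynth-12B
  - model: Violet-Lyra-Gutenberg-v2
  - model: Tlacuilo-12B
dtype: bfloat16                    # assumed precision
```

A config in this shape is the conventional input to merge tooling; the real smi_oni procedure may add per-model weights or other parameters not shown here.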
Key Capabilities
- Storytelling: Generates long-form narratives with coherent world-building and complex plots.
- Roleplay: Facilitates character-focused interactions and dynamic roleplaying scenarios.
- Creative Writing: Assists in developing ideas, drafting content, and crafting detailed scenes.
Intended Use
Astral-Noctra-12B is specifically tailored for applications requiring high-quality, imaginative text. It is an ideal choice for developers and users focused on:
- Generating engaging stories and fictional narratives.
- Creating interactive roleplaying experiences.
- Brainstorming and drafting creative content across various genres.
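For long-form uses like the above, the practical constraint is the 32,768-token context window: an accumulated story plus the model's next reply must fit inside it. The model-independent sketch below shows one way to budget that window, trimming the oldest story tokens when space runs out. The helper names and token counts are illustrative, not part of the model's API; in practice you would count tokens with the model's own tokenizer.

```python
# Sketch: splitting Astral-Noctra-12B's 32,768-token context window
# between the accumulated story (prompt) and newly generated text.
# Helper names are hypothetical; measure real lengths with the tokenizer.

CONTEXT_LEN = 32768  # Astral-Noctra-12B's context length


def prompt_budget(max_new_tokens: int, context_len: int = CONTEXT_LEN) -> int:
    """Return how many prompt tokens fit alongside the reply budget."""
    if not 0 < max_new_tokens < context_len:
        raise ValueError("max_new_tokens must be between 1 and context_len - 1")
    return context_len - max_new_tokens


def trim_history(token_ids: list[int], max_new_tokens: int) -> list[int]:
    """Keep only the most recent tokens that fit in the remaining budget,
    so generation never overflows the context window."""
    budget = prompt_budget(max_new_tokens)
    return token_ids[-budget:]


# Reserving 1,024 tokens for the model's reply leaves 31,744 for the story.
print(prompt_budget(1024))  # 31744
```

Dropping the oldest tokens is the simplest policy; for roleplay sessions it is common to instead pin a system prompt and character sheet at the front and trim only the middle of the history.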