Vortex5/Nova-Mythra-12B
Nova-Mythra-12B by Vortex5 is a 12-billion-parameter language model with a 32,768-token context length, created through a multi-stage merge of several specialized models, including Hollow-Aether-12B and KiloNovaSynth-12B. The merge is tuned for creative applications such as storytelling, roleplay, and imaginative writing: it targets long-form narratives and character-focused interactions, making it a fit for developers building generative-text features for creative content.
Overview
Nova-Mythra-12B is a 12-billion-parameter language model developed by Vortex5, featuring a 32,768-token context length. It was built through a multi-stage merging process that combines six base models: Hollow-Aether-12B, KiloNovaSynth-12B, NoctyxCosma-12B, Violet-Lyra-Gutenberg-v2, Lunar-Twilight-12B, and Tlacuilo-12B. The staged merge is intended to consolidate the strengths of its constituent models.
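The exact merge recipe is not published on this page. For readers unfamiliar with how such merges are assembled, one stage of a multi-stage merge is commonly expressed as a mergekit configuration along these lines. This is an illustrative sketch, not Vortex5's actual recipe: the `slerp` method, the `t` weight, and the repo namespaces are all assumptions.

```yaml
# Illustrative mergekit config for ONE stage of a multi-stage merge.
# Method, weight, and repo paths are assumptions -- the actual recipe
# used for Nova-Mythra-12B is not documented here.
models:
  - model: Vortex5/Hollow-Aether-12B      # assumed namespace
  - model: Vortex5/KiloNovaSynth-12B      # assumed namespace
merge_method: slerp                        # spherical interpolation of weights
base_model: Vortex5/Hollow-Aether-12B
parameters:
  t: 0.5                                   # 0.0 = pure base, 1.0 = pure second model
dtype: bfloat16
```

A multi-stage merge chains several such configs, feeding the output checkpoint of one stage in as an input model of the next.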
Key Capabilities
- Storytelling: Generates long-form narratives with sustained worldbuilding and complex plots.
- Roleplay: Optimized for character-focused interactions and dynamic role-playing scenarios.
- Creative Writing: Facilitates the generation of ideas, drafts, and scenes for various creative projects.
Intended Use
Nova-Mythra-12B is tailored for applications requiring high-quality generative text in creative domains. Its merged lineage makes it particularly well suited to:
- Assisting writers with drafting and expanding stories.
- Powering interactive fiction and role-playing games.
- Generating imaginative content and creative prompts.
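For the use cases above, the model can be driven through the Hugging Face `transformers` library like any causal LM. The sketch below builds a storytelling prompt and shows the generation call in comments; the ChatML template is an assumption (check the repo's tokenizer config for the format the merge actually inherited), and the sampling settings are illustrative.

```python
# Sketch: prompting Nova-Mythra-12B for a storytelling task.
# The ChatML wrapper below is an ASSUMPTION about the chat format;
# verify against the tokenizer's chat template before relying on it.

def build_chatml_prompt(system: str, user: str) -> str:
    """Wrap a system instruction and one user turn in ChatML markers,
    leaving the prompt open at the assistant turn for generation."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a vivid fantasy storyteller.",
    "Open a scene in a city built inside a dead volcano.",
)

# Actual generation (a 12B model needs roughly 24 GB of VRAM at fp16):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("Vortex5/Nova-Mythra-12B")
# model = AutoModelForCausalLM.from_pretrained(
#     "Vortex5/Nova-Mythra-12B", device_map="auto", torch_dtype="auto")
# inputs = tok(prompt, return_tensors="pt").to(model.device)
# out = model.generate(**inputs, max_new_tokens=512,
#                      do_sample=True, temperature=0.8, top_p=0.95)
# print(tok.decode(out[0], skip_special_tokens=True))
```

Sampling-heavy settings (temperature near 0.8, top-p near 0.95) are a common starting point for creative models like this one, trading determinism for variety.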