Vortex5/Stellar-Umbra-12B
Stellar-Umbra-12B: A Merged Model for Creative Applications
Stellar-Umbra-12B is a 12 billion parameter language model developed by Vortex5. It was created by merging three base models: Scarlet-Seraph-12B, KansenSakura-Erosion-RP-12B, and Rei-V3-KTO-12B. The merge uses a custom amsf merge method, with Vortex5/Scarlet-Seraph-12B serving as the tokenizer source.
Key Capabilities
- Advanced Merging: Utilizes a custom amsf merge method to combine the strengths of multiple specialized models.
- Extended Context: Features a context length of 32768 tokens, enabling the generation of longer, more coherent narratives and complex conversational flows.
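In practice, the 32768-token window still needs to be budgeted when running long roleplay sessions: the conversation history plus the reserved space for the model's reply must fit inside it. The model card does not prescribe a strategy, so the sketch below is one minimal, hedged approach: given per-message token counts (e.g. from a tokenizer), keep the longest suffix of recent messages that fits after reserving room for the output. The budget split is an assumption, not part of the model card.

```python
# Minimal sketch of context budgeting for Stellar-Umbra-12B's
# 32768-token window. The reserve_for_output default is an assumption.

CONTEXT_LENGTH = 32768

def trim_history(message_token_counts, reserve_for_output=1024):
    """Return the longest suffix of messages whose total token count,
    plus the reserved output budget, fits in the context window.

    message_token_counts: list of per-message token counts, oldest first.
    """
    budget = CONTEXT_LENGTH - reserve_for_output
    kept = []
    total = 0
    # Walk backwards so the most recent messages are kept first.
    for count in reversed(message_token_counts):
        if total + count > budget:
            break
        kept.append(count)
        total += count
    kept.reverse()
    return kept
```

Dropping whole messages from the front (rather than truncating mid-message) keeps character interactions coherent, which matters for the roleplay use case described below.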
Good For
Stellar-Umbra-12B is specifically intended for use cases requiring high-quality, imaginative text generation. Its primary strengths lie in:
- Storytelling: Crafting engaging and detailed narratives.
- Roleplay: Generating dynamic and consistent character interactions.
- Creative Writing: Assisting with various forms of creative content generation.
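For the use cases above, the model can be run like any causal language model on the Hugging Face Hub. The sketch below is a minimal, hedged example using the standard `transformers` API; the sampling parameters are illustrative assumptions, not recommendations from the model card.

```python
# Hedged sketch: generating creative text with Stellar-Umbra-12B via
# Hugging Face transformers. Sampling settings below are assumptions.

MODEL_ID = "Vortex5/Stellar-Umbra-12B"

def generate_story(prompt: str, max_new_tokens: int = 512) -> str:
    """Generate a continuation for a creative-writing prompt."""
    # Imported lazily so the module loads without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,       # sampling suits creative tasks; value is an assumption
        temperature=0.8,
        top_p=0.95,
    )
    # Decode only the newly generated tokens, not the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_story("The lighthouse keeper had not spoken in years, until"))
```

Note that a 12B-parameter model typically requires a GPU with substantial memory (or a quantized variant) to run locally.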