Vortex5/Nova-Mythra-12B
Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Jan 14, 2026 · Architecture: Transformer · Concurrency cost: 1
Nova-Mythra-12B by Vortex5 is a 12-billion-parameter language model with a 32,768-token context length, created through a multi-stage merge of several specialized models, including Hollow-Aether-12B and KiloNovaSynth-12B. The merge is tuned for creative applications such as storytelling, roleplay, and imaginative writing: it excels at long-form narratives and character-focused interactions, making it a good fit for developers building generative tools for creative content.
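A minimal usage sketch with the Hugging Face `transformers` library, assuming the weights are published in standard format under the repo id `Vortex5/Nova-Mythra-12B` (an assumption based on the model name shown above); running it requires a GPU with enough memory for a 12B FP8/FP16 checkpoint:

```python
# Hypothetical loading/generation sketch for Nova-Mythra-12B.
# Assumes the repo id below exists on the Hugging Face Hub and that
# the checkpoint loads via the standard AutoModelForCausalLM path.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Vortex5/Nova-Mythra-12B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype="auto",   # let transformers pick the checkpoint dtype
    device_map="auto",    # spread layers across available GPUs
)

# Creative-writing style prompt, the model's intended use case.
prompt = "Write the opening paragraph of a gothic mystery set in a lighthouse."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,       # sampling suits open-ended narrative generation
    temperature=0.8,
)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```

Sampling parameters such as `temperature` are illustrative starting points for narrative output, not values recommended by the model's authors.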