Vortex5/Astral-Noctra-12B
Type: Text generation
Model size: 12B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Architecture: Transformer
Published: Jan 12, 2026

Astral-Noctra-12B by Vortex5 is a 12-billion-parameter language model with a 32768-token (32k) context length, created by merging Hollow-Aether-12B, KiloNovaSynth-12B, Violet-Lyra-Gutenberg-v2, and Tlacuilo-12B. The model is optimized for creative applications, excelling at storytelling, roleplay, and imaginative writing. It uses a custom smi_oni merge method to combine the strengths of its constituent models for enhanced narrative generation.
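A merge of this kind is typically declared in a mergekit-style YAML configuration. The sketch below is hypothetical: the model paths are assumptions based on the names listed above, and whether smi_oni is available as a registered merge method is not confirmed by this card.

```yaml
# Hypothetical mergekit-style merge configuration (sketch only).
# "smi_oni" is the custom method named by the model card; it is not
# a standard mergekit merge method, and these source paths are assumed.
merge_method: smi_oni
models:
  - model: Hollow-Aether-12B
  - model: KiloNovaSynth-12B
  - model: Violet-Lyra-Gutenberg-v2
  - model: Tlacuilo-12B
dtype: bfloat16
```

In practice such a configuration would be passed to a merge tool (e.g. the `mergekit-yaml` CLI) to produce the merged checkpoint, which would then be quantized to FP8 for serving.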
