Naphula/Boreas-24B-v1.1
Text generation
Concurrency cost: 2
Model size: 24B
Quant: FP8
Context length: 32k
Published: Dec 27, 2025
License: apache-2.0
Architecture: Transformer

Naphula/Boreas-24B-v1.1 is a 24-billion-parameter Mistral-based language model merge, created with a custom 'rsce' method from 14 distinct Mistral 2501 finetunes. Optimized for grimdark narratives, ancient lore, and complex character interactions, it is aimed at creative writing with an emphasis on unyielding, intelligent storytelling. The model is specifically calibrated for roleplay (RP), delivering precise and imaginative generations within its 32768-token context length.
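Since the merge is built from Mistral 2501 finetunes, prompts presumably follow the Mistral-style instruct template. A minimal sketch of formatting a roleplay prompt under that assumption (the exact template is not stated on this card; the authoritative source is the model's `tokenizer_config.json` chat template):

```python
def build_mistral_prompt(system: str, user: str) -> str:
    # Mistral-style instruct format: system text is commonly prepended
    # inside the first [INST] block (assumption; verify against the
    # model's own chat template before relying on it).
    return f"[INST] {system}\n\n{user} [/INST]"

prompt = build_mistral_prompt(
    "You are a grim chronicler of a dying northern realm.",
    "Describe the frozen gates of the citadel.",
)
print(prompt)
```

For hosted deployments, the same string would typically be sent as the raw prompt of a completions request, or the system/user pair passed directly to a chat endpoint that applies the template server-side.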
