DarkArtsForge/Magistaroth-24B-v1
Task: Text generation
Model size: 24B
Quantization: FP8
Context length: 32k
Concurrency cost: 2
Published: Feb 22, 2026
License: apache-2.0
Architecture: Transformer

Magistaroth-24B-v1 by DarkArtsForge is a 24-billion-parameter language model built on the MistralForCausalLM architecture, with a 32,768-token context length. The model is a creative merge produced with the DELLA method and is optimized for generating detailed, creative narratives, including those with graphic or erotic content. It scores 14152 on Q0 Bench (Pass Q0G), indicating strong performance in its specialized domain.
