coder3101/Magidonia-24B-v4.3-heretic-v2
Text generation · Model size: 24B · Quant: FP8 · Context length: 32k · Architecture: Transformer · Concurrency cost: 2 · Published: Dec 20, 2025

Magidonia-24B-v4.3-heretic-v2 by coder3101 is a 24-billion-parameter language model with a 32K context length, derived from TheDrummer's Magidonia-24B-v4.3 and decensored using Heretic v1.1.0. It is optimized for creative writing, roleplay, and entertainment, and is intended for use cases where safety alignment is not desired. Compared to the original model, it refuses significantly fewer prompts, making it better suited to dynamic and imaginative applications.
