TigerKay/magidonia-24b-lumia-cot
TEXT GENERATION
Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Mar 16, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

TigerKay's Magidonia-24B-Lumia-CoT is a 24-billion-parameter causal language model, fine-tuned with SaRA (Sparse Retraining Architecture) from TheDrummer's Magidonia-24B-v4.3. It is designed to produce detailed chain-of-thought reasoning, following a 12-step framework, before generating each roleplay response. The model excels at structured reasoning for conversational AI, particularly in roleplay scenarios, and supports a 32K-token context length.
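Because the model emits its chain-of-thought before the roleplay reply, a client will typically want to separate the reasoning block from the final response. The sketch below assumes the reasoning is wrapped in `<think>...</think>` delimiters; the card does not specify which markers (if any) the model actually uses, so treat the tag names as placeholders to adjust for the real chat template.

```python
import re

def split_cot(output: str, open_tag: str = "<think>", close_tag: str = "</think>"):
    """Split a completion into (reasoning, reply).

    The <think>...</think> delimiters are an assumption; the model card
    only says chain-of-thought precedes the roleplay response, not how
    it is marked up.
    """
    pattern = re.escape(open_tag) + r"(.*?)" + re.escape(close_tag)
    match = re.search(pattern, output, flags=re.DOTALL)
    if not match:
        # No reasoning block found: return the whole output as the reply.
        return "", output.strip()
    reasoning = match.group(1).strip()
    reply = output[match.end():].strip()
    return reasoning, reply

# Hypothetical completion illustrating the 12-step reasoning framework.
completion = (
    "<think>Step 1: assess the scene...\n"
    "Step 12: compose the reply.</think>\n"
    "The knight lowers his sword."
)
reasoning, reply = split_cot(completion)
print(reply)  # -> The knight lowers his sword.
```

Stripping the reasoning client-side keeps the visible chat clean while still letting you log or display the model's intermediate steps when debugging a scenario.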
