Casual-Autopsy/Maginum-Cydoms-24B
Text generation · Model size: 24B · Quant: FP8 · Context length: 32k · Architecture: Transformer · Published: Dec 10, 2025

Casual-Autopsy/Maginum-Cydoms-24B is a 24-billion-parameter language model with a 32768-token context length, created by Casual-Autopsy by merging multiple pre-trained models. The merge uses the TIES, DELLA, and SLERP methods to combine several Mistral-based 24B models, including ones from anthracite-core, TheDrummer, and zerofata. By integrating the capabilities of these diverse models, it aims to deliver stronger all-around performance on general-purpose text generation and understanding tasks.
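To illustrate one of the merge methods named above: SLERP (spherical linear interpolation) blends two models' weights along the arc between them rather than along a straight line, which better preserves weight magnitudes. Below is a minimal sketch of SLERP on flat weight vectors; it is an illustration of the general technique, not the actual recipe or tooling used to build this model.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between weight vectors v0 and v1.

    t = 0 returns v0, t = 1 returns v1; intermediate t values follow
    the great-circle arc between the two (normalized) directions.
    """
    # Angle between the two vectors, computed on normalized copies.
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)

    # Nearly parallel vectors: fall back to plain linear interpolation
    # to avoid dividing by sin(theta) ~ 0.
    if theta < eps:
        return (1.0 - t) * v0 + t * v1

    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Example: halfway between two orthogonal unit vectors.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # points 45 degrees between a and b
```

In a real model merge, this interpolation would be applied tensor-by-tensor across the two checkpoints, often with a different `t` per layer.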
