Naphula-Archives/Acid2501-24B
- Task: text generation
- Concurrency cost: 2
- Model size: 24B
- Quant: FP8
- Context length: 32k
- Published: Feb 6, 2026
- License: apache-2.0
- Architecture: Transformer

Acid2501-24B by Naphula-Archives is a 24-billion-parameter language model built on the MistralForCausalLM architecture and described by its author as a "2501 only test" for Goetia. It is a merge of several Mistral-based models, including Mistral-Small-24B-Instruct-2501, Arcee-Blitz, ArliAI-RPMax-v1.4, Dolphin-Mistral-24B-Venice-Edition, Dans-DangerousWinds-V1.1.1-24b, ReadyArt's Broken-Tutu variants, Cydonia-24B-v2, MS-24B-Instruct-Mullein-v0, BlackSheep-24B, and MistralThinker-v1.1. The model has a 32,768-token context length and is intended for specific experimental evaluations within the Goetia framework rather than general-purpose use.
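Since the card lists a MistralForCausalLM architecture and a 32k context, the model can presumably be loaded with the standard Hugging Face transformers API. The sketch below is a hypothetical example, not an official usage snippet from the author; the repo id is taken from the card, and everything else assumes the usual `AutoModelForCausalLM` workflow.

```python
# Hypothetical loading sketch for Acid2501-24B.
# Assumes the standard Hugging Face transformers API; not an official example.

MODEL_ID = "Naphula-Archives/Acid2501-24B"  # repo id from the card
MAX_CONTEXT = 32768                          # 32k context length from the card


def load_model():
    """Load tokenizer and model. Requires `pip install transformers torch`
    and enough memory for 24B parameters (FP8 weights per the card)."""
    # Imported lazily so the constants above can be used without the library.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # let transformers pick the published dtype
        device_map="auto",   # shard across available GPUs if needed
    )
    return tokenizer, model
```

A 24B model at 8-bit precision needs roughly 24 GB of accelerator memory for the weights alone, before activations and KV cache, so multi-GPU or offloaded setups are likely required for the full 32k context.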
