m-polignano/ANITA-NEXT-24B-Dolphin-Mistral-UNCENSORED-ITA
TEXT GENERATION · Open Weights
Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Context Length: 32k
Published: Jul 25, 2025 · License: apache-2.0 · Architecture: Transformer

ANITA-NEXT-24B-Dolphin-Mistral-UNCENSORED-ITA by m-polignano is a 24-billion-parameter multilingual (English and Italian) large language model built on the Mistral architecture and fine-tuned from dphn/Dolphin-Mistral-24B-Venice-Edition. Within the ANITA family it is designed as an uncensored "Thinking Model" and offers a 32,768-token context length. It is intended for research use cases that require a model with fewer ethical and safety constraints.
