Naphula/Slimaki-24B-v1.2
Ślimaki-24B-v1.2 Overview
Naphula/Slimaki-24B-v1.2 is a 24-billion-parameter language model built on the Mistral architecture. It is a merge of multiple pre-trained models produced with the DELLA merge method, inspired by Casual-Autopsy/Maginum-Cydoms-24B. The model targets creative text generation and roleplay, with an emphasis on unconstrained narrative output.
Key Characteristics
- Zero Refusals: The model is explicitly designed to have no content refusals, allowing for unconstrained output.
- Enhanced Creativity: Version 1.2 aims to be more creative than its predecessor, with additional "spice injection" for richer narratives.
- Content Generation: Capable of producing narratives and roleplay content that includes violent and graphic erotic themes.
- Mistral Tekken Chat Template: For best results, use the Mistral Tekken chat template and adjust the system prompt to steer the generated content.
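As a rough illustration of the Tekken-style template mentioned above, the sketch below assembles a prompt string by hand. This is a sketch under the assumption that the template wraps the system prompt in `[SYSTEM_PROMPT]...[/SYSTEM_PROMPT]` and each user turn in `[INST]...[/INST]` (the function name is illustrative); always verify against the model tokenizer's bundled chat template before use.

```python
def build_tekken_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a prompt in a Mistral Tekken-style layout (illustrative sketch).

    `turns` is a list of (user, assistant) pairs; leave the last assistant
    string empty to prompt the model for a completion.
    """
    parts = ["<s>"]  # BOS token
    if system:
        parts.append(f"[SYSTEM_PROMPT]{system}[/SYSTEM_PROMPT]")
    for user, assistant in turns:
        parts.append(f"[INST]{user}[/INST]")
        if assistant:  # completed turns end with EOS
            parts.append(f"{assistant}</s>")
    return "".join(parts)

prompt = build_tekken_prompt("You are a narrator.", [("Begin the story.", "")])
```

In practice, prefer `tokenizer.apply_chat_template(...)` from the model's own tokenizer so the exact token layout is guaranteed to match.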
Merge Details
The model was constructed using mergekit and incorporates several base models, including anthracite-core/Mistral-Small-3.2-24B-Instruct-2506-Text-Only, TheDrummer/Cydonia-24B-v4.3, ReadyArt/4.2.0-Broken-Tutu-24b, and others. A custom modification to mergekit's sparsify.py script auto-shrinks epsilon values so that the DELLA merge's per-parameter drop probabilities stay within valid bounds.
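The epsilon auto-shrink described above can be sketched as a simple clamp: if a density plus or minus epsilon would leave the valid probability range [0, 1], epsilon is reduced until both bounds hold. The function name and exact bounds here are illustrative assumptions, not the actual patch:

```python
def shrink_epsilon(density: float, epsilon: float) -> float:
    """Hypothetical sketch of the sparsify.py auto-shrink: reduce epsilon so
    that density + epsilon <= 1 and density - epsilon >= 0, keeping DELLA's
    per-parameter drop probabilities valid. Not the actual mergekit patch."""
    # Largest epsilon that keeps both density + eps and density - eps in [0, 1].
    max_eps = min(1.0 - density, density)
    return min(epsilon, max_eps)

shrink_epsilon(0.75, 0.4)  # clamped so density + epsilon does not exceed 1
```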
Use Cases
This model is suited to applications requiring highly creative, unmoderated, and potentially graphic or erotic narrative generation and roleplay. Developers should be aware of its explicit content capabilities and tune system prompts accordingly.