Overview
Goetia 24B v1.2: A Specialized Merge for Narrative and Roleplay
Goetia 24B v1.2, developed by Naphula, is a merge of eighteen fine-tuned language models, primarily built on the Mistral-Small-3.1/3.2 architectures. Combined with the Karcher merge method, this 24-billion-parameter model with a 32,768-token context length is a significant upgrade over its predecessor, v1.1: it uses only finetunes as donors, which minimizes vector distortion and improves PCA manifold accuracy.
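The Karcher method merges weights via the Karcher (Fréchet) mean, i.e. the point minimizing squared geodesic distance to the donors on the unit hypersphere, rather than a plain linear average. A minimal NumPy sketch of that geometric idea is below; it is illustrative only, and the actual mergekit implementation may differ in normalization and scaling details.

```python
import numpy as np

def karcher_mean(vectors, iters=50, tol=1e-9):
    """Iterative Karcher mean of vectors projected onto the unit hypersphere."""
    vs = [v / np.linalg.norm(v) for v in vectors]
    m = vs[0].copy()
    for _ in range(iters):
        tangent = np.zeros_like(m)
        for v in vs:
            cos_t = np.clip(np.dot(m, v), -1.0, 1.0)
            theta = np.arccos(cos_t)
            if theta > 1e-12:
                # Log map: lift v into the tangent space at the current estimate m.
                tangent += (theta / np.sin(theta)) * (v - cos_t * m)
        tangent /= len(vs)
        step = np.linalg.norm(tangent)
        if step < tol:
            break  # Converged: mean tangent direction is (near) zero.
        # Exp map: walk along the averaged tangent back onto the sphere.
        m = np.cos(step) * m + np.sin(step) * (tangent / step)
        m /= np.linalg.norm(m)
    return m

# Toy example: the Karcher mean of two orthogonal unit vectors lies
# on the geodesic midway between them.
mean = karcher_mean([np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])])
```

Unlike a simple weighted average, the result stays on the sphere of unit-norm weight directions, which is the property credited here with reducing vector distortion when averaging many donors.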
Key Capabilities
- Advanced Narrative Generation: Optimized for creating detailed narratives and engaging roleplay scenarios.
- Explicit Content Generation: Capable of producing violent and graphic erotic content; this requires careful system-prompt adjustment and the Mistral Tekken chat template.
- Robust Merge Architecture: Built with the Karcher method for stability, combining numerous finetuned models without using other merges as donors.
- Cthulhu v1.4 Checkpoint: Serves as the foundational checkpoint for the planned Cthulhu v1.4, which aims for uncensored finetuning on H.P. Lovecraft datasets.
- Uncensored Variant (Qliphoth v1.2): Qliphoth v1.2, an ablation of Goetia v1.2, offers no refusals, providing an alternative for users who require completely unrestricted output.
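A merge of this shape could be expressed in a mergekit-style YAML configuration roughly as follows. This is a hedged sketch: the method name matches mergekit's Karcher merge, but the donor names are placeholders, not Goetia's actual eighteen donors, and any other fields should be checked against the mergekit documentation.

```yaml
# Illustrative mergekit config for a Karcher merge.
# Model names are placeholders, not the actual Goetia donor list.
merge_method: karcher
models:
  - model: example-org/narrative-finetune-24b
  - model: example-org/roleplay-finetune-24b
  - model: example-org/horror-finetune-24b
dtype: bfloat16
```

In keeping with the design described above, every entry in `models` would be a finetune of the same base architecture, never another merge.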
Good for
- Developers and users seeking a powerful model for creative writing, storytelling, and complex roleplay, especially those involving mature or explicit themes.
- Experimentation with advanced merge architectures and understanding the impact of specific finetunes on model behavior.
- Use as a base for further finetuning projects, particularly for uncensored content generation or specialized thematic datasets such as Lovecraftian horror.