Ancient Awakening 12B MPOA: Uncensored Narrative and Roleplay
Ancient-Awakening-12B-MPOA is a 12-billion-parameter language model developed by Naphula, featuring a 32768-token context length. The model is a highly complex merge of 70 different Mistral Nemo-based finetunes, combined through a nine-stage process that uses advanced mergekit methods such as karcher, flux, arcee_fusion, ramplus_tl, and pdq.
Key Capabilities
- Uncensored Content Generation: The MPOA (Magnitude-Preserving Orthogonalized Ablation) technique has been applied to remove guardrails, enabling the generation of narratives and roleplay content that may include violent and graphic erotic material.
- Complex Merge Architecture: Built from a diverse array of 70 donor models, including specialized intermediate merges like Kraken Karcher and Riemannian Redshift, intended to give it a broad and deep knowledge base.
- Extended Context Window: Supports a 32768 token context, facilitating long-form narrative and intricate roleplay scenarios.
- Optimized for Roleplay: The extensive merging of various finetunes, many of which are RP-focused, suggests strong capabilities in character consistency and dynamic storytelling.
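The core idea behind orthogonalized ablation can be illustrated in a few lines of NumPy: project a chosen direction (e.g. a learned refusal direction) out of each weight row, then rescale rows back to their original norms so magnitudes are preserved. This is a toy sketch of the general technique, not the exact MPOA implementation used for this model:

```python
import numpy as np

def orthogonalized_ablation(W, d):
    """Remove the component of each row of W along direction d,
    then rescale rows so their original magnitudes are preserved."""
    d = d / np.linalg.norm(d)
    orig_norms = np.linalg.norm(W, axis=1, keepdims=True)
    # Project the direction d out of every row of W.
    W_orth = W - np.outer(W @ d, d)
    new_norms = np.linalg.norm(W_orth, axis=1, keepdims=True)
    # Rescale each row to its original magnitude (magnitude-preserving step).
    return W_orth * (orig_norms / np.maximum(new_norms, 1e-12))

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))   # stand-in for a weight matrix
d = rng.standard_normal(8)        # stand-in for an ablation direction
W_abl = orthogonalized_ablation(W, d)

# Rows of W_abl are orthogonal to d, yet keep their original norms.
print(np.allclose(W_abl @ (d / np.linalg.norm(d)), 0, atol=1e-8))
print(np.allclose(np.linalg.norm(W_abl, axis=1), np.linalg.norm(W, axis=1)))
```

In practice this kind of edit is applied to specific projection matrices inside the transformer rather than to a random matrix, and the direction is estimated from activation differences between refused and complied prompts.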
Good for
- Creative Writing & Roleplay: Ideal for generating detailed, immersive, and unrestricted narratives, especially those requiring explicit or graphic content.
- Experimental AI Development: Useful for researchers and developers exploring the boundaries of uncensored language generation and complex model merging techniques.
- System Prompt Customization: Designed to work effectively with specific chat templates like ChatML or Mistral Tekken, allowing for tailored system prompts to guide its output.
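As a concrete illustration, a ChatML prompt (one of the templates mentioned above) can be assembled by hand; the system prompt text below is purely illustrative:

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} messages in ChatML format,
    ending with an open assistant turn for the model to complete."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
             for m in messages]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    # Illustrative system prompt; tailor it to steer the model's output.
    {"role": "system", "content": "You are a dark-fantasy narrator."},
    {"role": "user", "content": "Describe the ruined temple at dusk."},
])
print(prompt)
```

With the Hugging Face `transformers` library, the same formatting is normally handled by the tokenizer's `apply_chat_template` method, which reads the template shipped with the model.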