MN-12B-Nymphaea-RP: Roleplay and Creative Writing Specialist
MN-12B-Nymphaea-RP is a 12-billion-parameter language model, fine-tuned by 0xA50C1A1 from the Mistral Nemo Instruct 2407 base. It is designed and optimized specifically for roleplay and creative writing, and offers a 32768-token context window.
Key Capabilities & Features
- Specialized for Roleplay: Primarily developed for generating engaging and nuanced roleplay scenarios and creative narratives.
- Uncensored Output: The base model's weights were processed with Heretic prior to fine-tuning, removing refusal behavior so the model produces uncensored output.
- Advanced Training Method: Fine-tuned with DoRA (Weight-Decomposed Low-Rank Adaptation) using LoRA Rank 64, LoRA Alpha 64, and LoRA Dropout 0.05 for efficient, effective adaptation.
- Expanded Dataset: Trained on the latest iteration of the Darkmere dataset, which includes a diverse mix of manually curated synthetic and human-written stories, enhancing its creative range.
- Mistral V3-Tekken Template: Use the Mistral V3-Tekken instruct template (and matching presets) for best results; other templates may degrade output quality.
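As a rough illustration, the Mistral V3-Tekken format can be built by hand. This is a minimal sketch based on the community preset (the exact spacing is an assumption; verify against the model's bundled chat template, e.g. via `tokenizer.apply_chat_template`): V3-Tekken places no spaces around `[INST]`/`[/INST]` and closes each assistant turn with `</s>`.

```python
# Hedged sketch of the Mistral V3-Tekken instruct format.
# build_prompt is a hypothetical helper, not part of any library.

def build_prompt(turns):
    """turns: list of (user, assistant_or_None) pairs; the final
    pair may have assistant=None to request a completion."""
    prompt = "<s>"  # BOS token
    for user, assistant in turns:
        prompt += f"[INST]{user}[/INST]"
        if assistant is not None:
            prompt += f"{assistant}</s>"  # close completed assistant turns
    return prompt

print(build_prompt([
    ("Describe the tavern.", "The tavern is dim and loud."),
    ("Who enters?", None),
]))
```

In practice, prefer the tokenizer's own chat template over hand-built strings; the sketch is only meant to show the turn structure.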
Ideal Use Cases
- Interactive Storytelling: Generating dynamic and immersive story arcs and character interactions.
- Character Development: Crafting detailed and consistent character personas for roleplaying.
- Creative Content Generation: Assisting with various forms of creative writing, from short stories to complex narratives.
This model is particularly suited to developers and users who want an uncensored, specialized tool for generating high-quality creative and roleplay-centric text.
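For reference, the DoRA hyperparameters listed above can be expressed as a PEFT `LoraConfig`. This is a hedged configuration sketch, not the author's training script; the `target_modules` list is an assumption (the usual attention projections for Mistral-family models).

```python
# Sketch of a DoRA setup matching the stated hyperparameters,
# using Hugging Face PEFT (requires peft >= 0.9 for use_dora).
from peft import LoraConfig

dora_config = LoraConfig(
    r=64,                # LoRA rank, as stated in the model card
    lora_alpha=64,       # LoRA alpha
    lora_dropout=0.05,   # LoRA dropout
    use_dora=True,       # enable Weight-Decomposed LoRA (DoRA)
    task_type="CAUSAL_LM",
    # Assumed target modules; the actual fine-tune may differ.
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
```

This config would then be passed to `get_peft_model` alongside the base Mistral Nemo Instruct 2407 model.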