MN-12B-LucidFaun-RP-RU: A Surgical Merge for Enhanced Roleplay and Storytelling
This 12-billion-parameter model, developed by limloop, is a diagnostic SLERP merge of the Mistral Nemo-based Faun and lucid models. It addresses Faun's tendency toward censorship by selectively replacing late MLP layers with lucid's weights, producing a model that retains Faun's distinct personality and instruction format while gaining lucid's stability and uncensored behavior.
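A layer-selective SLERP merge like this is typically expressed as a mergekit configuration. The sketch below is illustrative only: the repository names are placeholders, and the layer count and interpolation gradient are assumptions, not the author's actual recipe.

```yaml
# Illustrative mergekit SLERP config -- NOT the actual merge recipe.
# Repository names, layer count (Mistral Nemo has 40 layers), and the
# t-gradient are placeholders for the idea of "late MLP replacement".
slices:
  - sources:
      - model: limloop/Faun    # placeholder name
        layer_range: [0, 40]
      - model: limloop/lucid   # placeholder name
        layer_range: [0, 40]
merge_method: slerp
base_model: limloop/Faun
parameters:
  t:
    # Keep Faun's weights everywhere, except pull MLP sublayers
    # toward lucid (t -> 1) in the final portion of the stack.
    - filter: mlp
      value: [0, 0, 0, 0.5, 1]
    - value: 0
dtype: bfloat16
```

The `filter: mlp` gradient is what makes the merge "surgical": attention and embedding weights stay at `t = 0` (pure Faun), while only late MLP blocks interpolate toward lucid.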
Key Capabilities
- Near-total absence of censorship: Rare disclaimers only at very high temperatures, with generation continuing uninterrupted.
- Enhanced stability: Outperforms Faun at lower temperatures (≤0.5) and remains stable up to 0.8 with top_k=20.
- Robust storytelling: Inherits lucid's advanced features for scene planning, plot management, and character development.
- Tool calling: Fully supported, maintaining Faun's original capabilities.
- Multilingual support: Proficient in both Russian and English.
- Extended context: Stable operation verified up to 8192 tokens.
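The stability notes above can be captured as sampling presets. The sketch below is a minimal illustration; the parameter names follow common inference-API conventions (`temperature`, `top_k`) and the helper function is hypothetical, not part of the model card.

```python
# Sampling presets derived from the stability notes above.
# Dict keys follow common inference-API conventions; the presets and
# the choose_preset helper are illustrative, not an official API.

CONSERVATIVE = {"temperature": 0.5, "top_k": 20}  # most stable regime
UPPER_BOUND = {"temperature": 0.8, "top_k": 20}   # verified upper limit

def choose_preset(need_variety: bool) -> dict:
    """Pick higher temperature for output variety, lower for stability."""
    return UPPER_BOUND if need_variety else CONSERVATIVE
```

Staying at or below these settings keeps generation within the ranges the card reports as stable; pushing temperature past 0.8 is where rare disclaimers were observed.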
Good For
- Immersive roleplay scenarios: Combines Faun's lively character with lucid's narrative depth.
- Creative writing and detailed story generation: Leverages lucid's strengths in crafting coherent and engaging narratives.
- Applications requiring uncensored responses: Ideal for use cases where content filtering is undesirable.
- Developers seeking a stable model for long-context interactions: Proven stability up to 8K tokens.