limloop/MN-12B-Faun-RP-RU

Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32k · Published: Mar 18, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

MN-12B-Faun-RP-RU by limloop is a 12-billion-parameter TIES-merged model based on Mistral Nemo with a 32768-token context length. It is optimized for high-quality, consistent roleplay and improved fluency in Russian, with an expanded vocabulary covering diverse topics, including NSFW content. The model is mostly uncensored, making it suitable for applications that need expressive, less restricted text generation.
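A minimal inference sketch with Hugging Face transformers is shown below. It assumes the weights are published under the limloop/MN-12B-Faun-RP-RU repo id and that the tokenizer ships a chat template inherited from Mistral Nemo Instruct; adjust the dtype and sampling settings to your hardware and use case.

```python
# Minimal sketch: load the model and run one Russian roleplay turn.
# Repo id and chat template are assumptions, not confirmed by the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "limloop/MN-12B-Faun-RP-RU"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 load; apply quantization separately if needed
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a narrator running a roleplay scene in Russian."},
    {"role": "user", "content": "Опиши таверну, в которую входит мой персонаж."},
]

inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=300, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```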


Overview

MN-12B-Faun-RP-RU is a 12-billion-parameter TIES-merged model developed by limloop on top of the Mistral Nemo 12B architecture. It builds on the ideas behind Hydra-style merges, with a primary focus on significantly improving Russian language capability and roleplay quality. The model is designed for stable performance across long contexts, with stability tested up to roughly 8192 tokens.

Key Capabilities

  • Enhanced Russian Language: Offers improved fluency and a richer, more varied vocabulary, including complex and NSFW domains.
  • Consistent Roleplay: Delivers more stable and immersive character portrayal and dialogue style.
  • Mostly Uncensored: Provides less restricted content generation, though it may occasionally add disclaimers for sensitive prompts without blocking output.
  • Strong Instruction Following: Adheres well to user instructions.
  • Stable Long Contexts: Maintains performance and consistency up to ~8K tokens (see the serving sketch after this list).
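
Since the hosted quant is FP8 and stability is reported up to ~8K tokens, a self-hosted deployment might cap the context window accordingly. Below is a hedged vLLM sketch; the repo id and FP8 availability outside the hosted endpoint are assumptions.

```python
# Sketch of serving the model with vLLM under the tested stable context length.
from vllm import LLM, SamplingParams

llm = LLM(
    model="limloop/MN-12B-Faun-RP-RU",  # assumed repo id
    max_model_len=8192,                 # stay within the ~8K tested stable range
    quantization="fp8",                 # matches the published FP8 quant; drop if unsupported
)

params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=300)
prompt = "Продолжи сцену от лица трактирщика:"
print(llm.generate([prompt], params)[0].outputs[0].text)
```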

Good for

  • Applications requiring high-quality, expressive roleplay in Russian.
  • Generating text with an expanded vocabulary, including sensitive or niche topics.
  • Use cases where a less censored model is preferred, with an understanding of occasional disclaimers.
  • Scenarios demanding stable performance over moderately long conversational contexts.