limloop/MN-12B-Hydra-RP-RU
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Mar 2, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

limloop/MN-12B-Hydra-RP-RU is an experimental 12-billion-parameter TIES merge based on Mistral Nemo, with a 32,768-token context length. Developed by limloop, it is optimized for advanced roleplay and deep literary fluency in Russian, and its behavior is uncensored. The model maintains character consistency and narrative depth in Russian, making it suitable for creative writing and interactive storytelling applications where explicit content may be desired.
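The card does not specify a prompt format; Mistral Nemo derivatives typically consume a chat-message list rendered through the tokenizer's chat template. A minimal sketch of assembling a Russian roleplay request, assuming that standard format (the character card and user turn below are illustrative, not from the model card):

```python
def build_roleplay_messages(character_card: str, user_turn: str) -> list[dict]:
    """Assemble a chat-message list for a roleplay session.

    The system message carries the character description; since the
    model is tuned for Russian, the card can be written in Russian.
    """
    return [
        {"role": "system", "content": character_card},
        {"role": "user", "content": user_turn},
    ]

messages = build_roleplay_messages(
    "Ты — трактирщик в фэнтезийном городе. Отвечай в характере.",
    "Приветствую! Что нового в городе?",
)
```

The resulting list would then be passed to the serving API (or to a tokenizer's `apply_chat_template`) rather than concatenated by hand, so the template-specific special tokens are inserted correctly.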
