0xA50C1A1/MN-12B-Nymphaea-RP
Text generation | Concurrency cost: 1 | Model size: 12B | Quant: FP8 | Context length: 32k | Published: Apr 10, 2026 | License: apache-2.0 | Architecture: Transformer | Open weights

MN-12B-Nymphaea-RP is a 12-billion-parameter fine-tune of Mistral Nemo Instruct 2407, developed by 0xA50C1A1 and optimized for roleplay and creative writing, with a 32,768-token context length. It was trained using DoRA on an expanded Darkmere dataset that mixes synthetic and human-written stories, and it is uncensored: the base model's weights underwent obliteration before fine-tuning.
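Since the model derives from Mistral Nemo Instruct, it presumably expects Mistral's `[INST]`-style chat format. Below is a minimal sketch of assembling a multi-turn roleplay prompt by hand; the exact template details are an assumption based on the base model, and in practice you should prefer `tokenizer.apply_chat_template` from the model's own tokenizer config:

```python
# Sketch: build a Mistral-instruct-style prompt for multi-turn roleplay.
# The template below (system text folded into the first [INST] block,
# completed replies closed with </s>) is an assumption based on the
# Mistral Nemo base model, not something stated on this model card.

def build_prompt(turns: list[tuple[str, str]], system: str = "") -> str:
    """turns: list of (user_message, assistant_reply); the last reply
    may be an empty string when awaiting a new completion."""
    prompt = "<s>"
    first = True
    for user, assistant in turns:
        # Fold the system/persona text into the first user message.
        content = f"{system}\n\n{user}" if first and system else user
        first = False
        prompt += f"[INST] {content} [/INST]"
        if assistant:
            prompt += f" {assistant}</s>"
    return prompt

# Example: one completed turn, then a new user message awaiting a reply.
p = build_prompt(
    [("Describe the tavern.", "The tavern is dim and smoky."),
     ("Who is at the bar?", "")],
    system="You are the narrator of a fantasy story.",
)
print(p)
```

Keeping the full turn history inside the prompt is what makes long-context (32k) roleplay work; the running transcript simply grows until it approaches the context limit, at which point older turns must be truncated or summarized.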
