Aratako/NemoAurora-RP-12B
Text Generation · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Concurrency Cost: 1 · Published: Jun 7, 2025 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

Aratako/NemoAurora-RP-12B is a 12 billion parameter instruction-tuned language model developed by Aratako, built upon the Mistral-Nemo-Instruct-2407 base. This model is specifically enhanced for role-playing scenarios through a merge of multiple international models. It is optimized for generating character-driven dialogues and narratives, supporting a context length of 32768 tokens.
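As a sketch of how a role-play conversation might be assembled before being sent to such a model, the snippet below builds a message list and renders it with a minimal stand-in for the Mistral `[INST]` chat format. `format_mistral_prompt` is a hypothetical helper written for illustration; in practice, `tokenizer.apply_chat_template` from the `transformers` library applies the model's actual chat template, which may differ in detail.

```python
# Minimal sketch: assemble a role-play conversation for a
# Mistral-Nemo-based model. format_mistral_prompt is a hypothetical
# stand-in for transformers' tokenizer.apply_chat_template and only
# approximates the [INST] style used by Mistral instruct models.

def format_mistral_prompt(messages):
    """Render system/user/assistant turns in a simplified [INST] format."""
    parts = []
    system = ""
    for msg in messages:
        if msg["role"] == "system":
            # Fold the system prompt into the next user turn.
            system = msg["content"] + "\n\n"
        elif msg["role"] == "user":
            parts.append(f"[INST] {system}{msg['content']} [/INST]")
            system = ""
        elif msg["role"] == "assistant":
            parts.append(f" {msg['content']}</s>")
    return "<s>" + "".join(parts)

messages = [
    {"role": "system", "content": "You are Aurora, a ship's AI with a dry wit."},
    {"role": "user", "content": "Aurora, status report."},
]
prompt = format_mistral_prompt(messages)
print(prompt)
```

The rendered prompt would then be tokenized and passed to the model for generation; with a 32k-token context window, long multi-turn role-play histories can be carried in a single prompt.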
