Rotor_24B_V.1: A Roleplay-Optimized Merge Model
OddTheGreat/Rotor_24B_V.1 is a 24-billion-parameter language model created by merging several Mistral-Small 24B finetunes. The primary goal behind its development was a robust roleplay (RP) model that prioritizes natural, non-mechanical prose over the stilted style typical of LLM outputs. The model is designed to be smart, creative, and highly capable of following instructions, and it handles up to 12,000 tokens of context effectively.
Key Capabilities
- Advanced Roleplay: Engineered specifically for immersive RP, focusing on natural dialogue and character consistency.
- Instruction Following: Adheres closely to given instructions.
- Creative Prose: Prioritizes a less "mechanical" writing style, enhancing narrative quality.
- Long Dialogue Management: Stable and proficient in maintaining coherence and context over extended conversations.
- Context Handling: Reliable up to 12,000 tokens of context within a full context length of 32,768 tokens.
- Multilingual Support: Tested with positive results for both assistant and roleplay modes in Russian, and for assistant mode in French.
- Balanced Content: Maintains an appropriate balance between SFW and NSFW content for the scenario.
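Because the model is reliable up to about 12,000 tokens of its 32,768-token window, long roleplay sessions benefit from trimming older turns to stay inside that budget. A minimal sketch of one way to do this (the function name and the word-based token estimate are illustrative, not part of the model or any library API):

```python
def trim_history(turns, max_tokens=12_000, tokens_per_word=1.3):
    """Drop the oldest turns until the estimated token count fits the budget.

    `turns` is a list of message strings, oldest first. The word-count
    heuristic is a rough stand-in; for exact counts, use the model's
    own tokenizer.
    """
    def estimate(text):
        return int(len(text.split()) * tokens_per_word)

    kept = list(turns)
    # Always keep at least the most recent turn.
    while len(kept) > 1 and sum(estimate(t) for t in kept) > max_tokens:
        kept.pop(0)  # discard the oldest turn first
    return kept
```

For example, `trim_history(["very long old scene ...", "recent turn"], max_tokens=100)` would drop the old scene once the estimate exceeds the budget, keeping only the recent turn.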
Ideal Use Cases
- Text Adventures: Specifically optimized for interactive text-based games and storytelling.
- Creative Writing & Roleplay: Suitable for generating engaging narratives and character interactions.
- Multilingual Assistant: Can be used for general assistance in Russian and French, in addition to its primary RP focus.
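As a Mistral-Small derivative, the model is normally prompted with Mistral-family chat formatting, commonly the `[INST] ... [/INST]` instruct style. A rough, illustrative sketch of flattening a roleplay history into that shape is below; the exact template (including how the system prompt is handled) should be taken from the model's own tokenizer via `tokenizer.apply_chat_template`, not from this sketch:

```python
def build_prompt(system, history, user_message):
    """Render a system prompt, past (user, assistant) pairs, and the new
    user message into a Mistral-style [INST] prompt string.

    The system prompt is folded into the first user turn, which is how
    Mistral-family templates commonly handle it (an assumption here).
    """
    segments = []
    turns = list(history) + [(user_message, None)]
    for i, (user, assistant) in enumerate(turns):
        content = f"{system}\n\n{user}" if i == 0 else user
        segments.append(f"[INST] {content} [/INST]")
        if assistant is not None:
            # Close completed assistant turns with the end-of-sequence marker.
            segments.append(f" {assistant}</s>")
    return "".join(segments)
```

For instance, `build_prompt("You are the narrator.", [("Hi", "Hello!")], "Continue.")` yields a single string ending in an open `[INST] Continue. [/INST]` turn, ready for the model to complete.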