AshhLimaRP-Mistral-7B: Longform Roleplay Specialist
AshhLimaRP-Mistral-7B is a 7-billion-parameter model developed by lemonilia, fine-tuned specifically for longform, novel-style 1-on-1 roleplaying chat. It builds on Ashhwriter-Mistral-7B, a base model trained on human-written lewd stories, and was further refined with 2,000 training samples of up to approximately 9k tokens each. The model is designed to replicate internet-forum-style roleplay and explicitly does not support short-form, IRC/Discord-style RP.
Key Capabilities
- Longform Roleplay: Excels at generating detailed, narrative-driven responses suitable for extended roleplaying scenarios.
- Customizable Response Length: Features a message length control mechanism, allowing users to specify a desired response length (e.g., micro, medium, massive) directly in the prompt, influencing bot output.
- Persona and Scenario Integration: Designed to incorporate character personas and scenario data for immersive roleplay.
- Extended Alpaca Prompt Format: Utilizes an adapted Alpaca format for multi-turn conversations, using ### Instruction:, ### Input:, and ### Response: tags.
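The pieces above can be assembled into a single prompt string. The sketch below follows LimaRP's convention of appending a "(length = ...)" modifier to the Response tag; the exact instruction wording and the helper's name are illustrative assumptions, not the model's required phrasing.

```python
def build_prompt(persona: str, scenario: str, user_message: str,
                 char_name: str = "Character", length: str = "medium") -> str:
    """Assemble a single-turn prompt in the extended Alpaca format.

    The "(length = ...)" modifier on the Response tag is the message
    length control described above; keywords reportedly range from
    "micro" to "massive". The instruction text here is illustrative.
    """
    return (
        "### Instruction:\n"
        f"{char_name}'s Persona: {persona}\n\n"
        f"Scenario: {scenario}\n\n"
        f"Play the role of {char_name} and engage in a roleplaying chat "
        "with User below this line.\n\n"
        "### Input:\n"
        f"User: {user_message}\n\n"
        f"### Response: (length = {length})\n"
        f"{char_name}: "
    )

prompt = build_prompt(
    persona="A stoic knight with a dry sense of humor.",
    scenario="User and Character shelter from a storm in an old watchtower.",
    user_message="Do you think the rain will let up by morning?",
    length="massive",
)
```

The prompt deliberately ends with the character's name followed by a colon, so the model's completion begins as that character's reply.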
Good for
- Developers and users seeking a model specialized in detailed, multi-turn, novel-style roleplaying.
- Applications requiring fine-grained control over response length in conversational AI.
- Creating immersive and character-driven interactive narratives.
- Use cases where the base model's training on human-written lewd stories aligns with content requirements.
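For the multi-turn use cases above, earlier exchanges are carried in the prompt by repeating the Input/Response tag pairs, with a final open Response tag where the model writes. A minimal sketch, with the helper name and turn encoding as illustrative assumptions:

```python
def build_history(instruction: str, turns: list[tuple[str, str]],
                  char_name: str = "Character", length: str = "medium") -> str:
    """Render a multi-turn conversation in the extended Alpaca format.

    `turns` holds (speaker, text) pairs: "user" turns become ### Input:
    blocks, any other speaker becomes a ### Response: block. The final
    open Response tag (with the length modifier) is the generation point.
    """
    parts = [f"### Instruction:\n{instruction}\n"]
    for speaker, text in turns:
        if speaker == "user":
            parts.append(f"### Input:\nUser: {text}\n")
        else:
            parts.append(f"### Response:\n{char_name}: {text}\n")
    parts.append(f"### Response: (length = {length})\n{char_name}: ")
    return "\n".join(parts)

history = build_history(
    "Play the role of Character in a chat with User.",
    [("user", "We made it to the ridge."),
     ("char", "Barely. Catch your breath; the descent is worse."),
     ("user", "You always say that.")],
)
```

Trimming the oldest turns from the list keeps the rendered prompt within the roughly 9k-token window the model was trained on.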