MS-24B-Mullein-v0 Overview
MS-24B-Mullein-v0 is a 24-billion-parameter language model developed by trashpanda-org, with a 32,768-token context length. It is designed for nuanced character and scenario portrayal, making it well suited to roleplay and interactive narrative applications. Notable traits include varied responses across rerolls, a tendency toward NPC characterization, and accurate depiction of characters and scenarios.
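One practical consequence of the 32,768-token context length is that long roleplay sessions eventually need their history trimmed. Below is a minimal sketch of one way to do that; the word-count token estimate and the function names are illustrative stand-ins, not part of the model's tooling — a real integration would count tokens with the model's actual tokenizer.

```python
# Sketch: trimming chat history to fit the 32,768-token context window.
# estimate_tokens is a crude word-count stand-in for the real tokenizer
# (an assumption made for illustration only).

CONTEXT_LENGTH = 32_768

def estimate_tokens(text: str) -> int:
    # Rough approximation; swap in the model's tokenizer for real use.
    return len(text.split())

def trim_history(system_prompt: str, turns: list[str],
                 budget: int = CONTEXT_LENGTH) -> list[str]:
    """Keep the system prompt plus as many recent turns as fit the budget."""
    remaining = budget - estimate_tokens(system_prompt)
    kept: list[str] = []
    # Walk from the newest turn backwards, keeping turns while they fit.
    for turn in reversed(turns):
        cost = estimate_tokens(turn)
        if cost > remaining:
            break
        kept.append(turn)
        remaining -= cost
    return [system_prompt] + list(reversed(kept))
```

Dropping the oldest turns first preserves the system prompt and the most recent exchanges, which matter most for character consistency.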
Key Characteristics
- Character & Scenario Portrayal: Excels at maintaining accurate character and scenario details throughout interactions.
- Diverse Responses: Provides varied outputs across rerolls, contributing to dynamic and less repetitive interactions.
- NPC Characterization: Shows a predisposition for generating detailed and consistent non-player character (NPC) behaviors and dialogues.
- Low Positivity Bias: Exhibits little to no inherent positivity bias, allowing a wider range of emotional and thematic expression, including occasionally "unhinged" outputs.
- Adherence to Structure: Demonstrates strong adherence to the structure of the initial message and rarely impersonates the user.
Training Data Highlights
The model was trained on a diverse set of datasets, including:
- Allura's Sugarquill 10k for creative writing.
- Estrogen's floyd-instruct and woke-identity.
- Gryphe's Sonnet3.5 RP and 4o WP datasets, filtered for quality.
- Anthracite-org's kalo-opus-instruct-22k-no-refusal.
- Norquinal's OpenCAI and Dampfinchen's Creative Writing Multiturn.
- Recursal's SCP wiki dataset.
Intended Use Cases
This model is particularly well-suited for:
- Roleplaying: Generating dynamic and character-consistent responses in interactive roleplay scenarios.
- Creative Writing: Assisting with narrative generation, character development, and scenario building.
- Interactive Storytelling: Creating engaging and varied story paths based on user input.
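For the roleplay use case above, a prompt typically combines a character card with the running chat log. The sketch below shows one plain-text way to assemble such a prompt; the bracketed section layout and function name are illustrative assumptions, not the model's official chat template.

```python
# Sketch: assembling a roleplay prompt from a character card and chat log.
# The "[Character]" / "[Chat]" layout is an illustrative convention only,
# not the model's prescribed prompt format.

def build_prompt(card: str, history: list[tuple[str, str]],
                 user_message: str) -> str:
    """Concatenate the character card, prior turns, and the new message."""
    lines = [f"[Character]\n{card}", "[Chat]"]
    for speaker, text in history:
        lines.append(f"{speaker}: {text}")
    lines.append(f"User: {user_message}")
    return "\n".join(lines)
```

Keeping the character card at the top of every prompt helps the model maintain the accurate character portrayal this model card highlights.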