# Retreatcost/Shisa-K-sakurization: An Experimental Roleplaying Merge

## Overview
Retreatcost/Shisa-K-sakurization is a 12-billion-parameter language model created with mergekit through an experimental merge. The model specifically targets enhanced roleplaying capabilities by integrating a LoRA adapter derived from PocketDoc/Dans-SakuraKaze-V1.0.0-12b into a Shisa-K-12B base.
## Key Capabilities
- Enhanced Roleplaying: The primary focus of this merge is to significantly improve the model's ability to engage in and generate rich, detailed roleplay scenarios.
- Extended Context: With a 32,768-token context length, the model can maintain longer, more complex narratives and character interactions.
- ChatML Format: The model is designed to be used with the ChatML format, facilitating structured conversational inputs.
- Japanese Symbol Output: Users may occasionally encounter stray Japanese characters in the output, a quirk that can often be mitigated by setting `top_p` to 0.90 and `min_p` to 0.1.
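Since the model expects ChatML-formatted input, a minimal sketch of rendering a conversation into that format may be useful. The `<|im_start|>`/`<|im_end|>` tags follow the standard ChatML convention; the example character and messages are illustrative only.

```python
def to_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Hypothetical roleplay exchange, for illustration only.
messages = [
    {"role": "system", "content": "You are Aiko, a cheerful tavern keeper."},
    {"role": "user", "content": "I push open the tavern door."},
]
prompt = to_chatml(messages)
print(prompt)
```

When serving the model through a backend that applies a chat template automatically (e.g. via the tokenizer's built-in template), this manual formatting is unnecessary; it is shown here only to make the expected structure explicit.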
## Merge Details
The model was constructed using the linear merge method, with ./retokenized_SHK as the base and a LoRA adapter (./lora_Dans-SakuraKaze-V1.0.0-12b-64d) applied with a weight of 1.0. The tokenizer source is Retreatcost/KansenSakura-Radiance-RP-12b.
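A mergekit configuration reproducing the merge described above might look roughly like the following. This is an approximate sketch, not the author's actual config: the `model+lora` path syntax for applying a LoRA adapter, the `dtype` choice, and the exact tokenizer key are assumptions that should be checked against the mergekit documentation.

```yaml
# Approximate mergekit config sketch (keys and LoRA syntax are assumptions)
merge_method: linear
models:
  - model: ./retokenized_SHK+./lora_Dans-SakuraKaze-V1.0.0-12b-64d
    parameters:
      weight: 1.0
tokenizer_source: Retreatcost/KansenSakura-Radiance-RP-12b
dtype: bfloat16
```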
## Good For
- Developers and users seeking a specialized model for creative and immersive roleplaying applications.
- Scenarios requiring deep character engagement and extended narrative coherence.
- Experimentation with merged models for specific domain enhancements.