Overview
DS-Archive/ds-smol-brew-7b is a 7-billion-parameter language model built on the Llama 2 architecture. It is a composite model created through a SLERP (spherical linear interpolation) merge of five models: jondurbin/spicyboros-7b-2.2, stabilityai/StableBeluga-7B, NousResearch/Nous-Hermes-llama-2-7b, lemonilia/limarp-llama2-v2, and PygmalionAI/pygmalion-2-7b. The merge aims to combine the strengths of its constituent models.
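SLERP interpolates between two sets of weights along the great-circle arc between them rather than along a straight chord, which tends to preserve weight magnitudes better than plain averaging. The sketch below is a minimal, illustrative implementation over flattened weight vectors; the `slerp` helper and its signature are assumptions for exposition, not the actual tooling used to build this model:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    Illustrative sketch: real merge tooling applies this per-tensor
    across two checkpoints at a time.
    """
    # Normalize both vectors to find the angle between them
    norm0 = math.sqrt(sum(x * x for x in v0)) + eps
    norm1 = math.sqrt(sum(x * x for x in v1)) + eps
    dot = sum((x / norm0) * (y / norm1) for x, y in zip(v0, v1))
    dot = max(-1.0, min(1.0, dot))  # guard acos against rounding
    theta = math.acos(dot)

    if theta < eps:
        # Nearly parallel vectors: plain linear interpolation is numerically safer
        return [(1 - t) * x + t * y for x, y in zip(v0, v1)]

    # Weights that trace the arc between v0 and v1
    s = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return [w0 * x + w1 * y for x, y in zip(v0, v1)]
```

Note that SLERP operates on two endpoints at a time, so merging five models typically means applying it pairwise in a sequence of merges.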
Key Capabilities
- Roleplaying Chat: The model is specifically designed and optimized for engaging in character-driven roleplaying conversations.
- Diverse Persona Generation: By integrating models known for their conversational and creative text generation, it can adopt and maintain various character personas.
Usage and Limitations
While various prompt formats may work, the model is reported to respond well to the Alpaca instruction format, particularly the LIMARP v2 style, which includes explicit sections for character persona, user persona, scenario, and user input. Because of its origins and training data, the model may exhibit biases similar to those found in niche online roleplaying communities. It is not intended to provide factual information or advice of any kind. For detailed training information, refer to the repositories of the individual merged models.
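As an illustration of the LIMARP v2 style described above, the helper below assembles an Alpaca-formatted prompt containing the four sections mentioned. The `build_limarp_prompt` name and the exact field labels are assumptions for illustration; consult the lemonilia/limarp-llama2-v2 repository for the canonical template:

```python
def build_limarp_prompt(char_persona, user_persona, scenario, user_message,
                        char_name="Character", user_name="User"):
    """Assemble an Alpaca/LIMARP-v2-style roleplay prompt.

    Illustrative sketch only: field labels here are assumed, not taken
    from the model's actual training template.
    """
    return (
        "### Instruction:\n"
        f"Character's Persona: {char_persona}\n"
        f"User's Persona: {user_persona}\n"
        f"Scenario: {scenario}\n"
        f"Play the role of {char_name}, responding to {user_name}.\n"
        "\n"
        "### Input:\n"
        f"{user_name}: {user_message}\n"
        "\n"
        "### Response:\n"
        f"{char_name}:"
    )
```

The resulting string ends with the character's name and a colon, so the model's completion continues in-character from that point.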