DS-Archive/ds-brew-13b

  • Task: Text generation
  • Concurrency cost: 1
  • Model size: 13B
  • Quantization: FP8
  • Context length: 4k
  • License: agpl-3.0
  • Architecture: Transformer

DS-Archive/ds-brew-13b is a 13 billion parameter language model based on the Llama 2 architecture, created by DS-Archive. This model is a SLERP merge of several Llama 2-based models, including Airoboros, Chronos, Nous-Hermes, LIMARP v2, and Pygmalion-2. It is specifically optimized for roleplaying scenarios, leveraging its merged components to generate character-driven responses. The model is designed for interactive narrative and conversational roleplay, with a context length of 4096 tokens.


DS-Archive/ds-brew-13b: A Llama 2-based Roleplaying Merge

DS-Archive/ds-brew-13b is a 13 billion parameter language model built upon the Llama 2 architecture. It is the result of a SLERP (Spherical Linear Interpolation) merge of five distinct Llama 2-based models:

  • Airoboros
  • Chronos
  • Nous-Hermes
  • LIMARP v2
  • Pygmalion-2
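Unlike a plain weighted average, SLERP interpolates along the arc between two weight tensors, preserving their magnitude more faithfully. A minimal sketch of the operation for a single tensor pair, assuming a simplified recipe (not the exact merge configuration used for this model):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`. Falls back to plain linear
    interpolation when the tensors are nearly parallel. This is an
    illustrative sketch, not the merge tooling actually used.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    a_unit = a_flat / (np.linalg.norm(a_flat) + eps)
    b_unit = b_flat / (np.linalg.norm(b_flat) + eps)
    # Angle between the two tensors, clipped for numerical safety.
    omega = np.arccos(np.clip(np.dot(a_unit, b_unit), -1.0, 1.0))
    if np.abs(np.sin(omega)) < eps:
        # Nearly parallel tensors: lerp is a safe approximation.
        return (1.0 - t) * a + t * b
    so = np.sin(omega)
    coeff_a = np.sin((1.0 - t) * omega) / so
    coeff_b = np.sin(t * omega) / so
    return (coeff_a * a_flat + coeff_b * b_flat).reshape(a.shape)
```

In practice a tool like mergekit applies this per-tensor across entire checkpoints, often with a different `t` per layer group.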

Key Capabilities & Differentiators

This model's primary strength lies in its specialization for roleplaying. By merging models known for their conversational and narrative generation abilities, ds-brew-13b is designed to excel at:

  • Character-driven interactions: Generating responses that adhere to specified character personas.
  • Scenario-based storytelling: Engaging in dynamic narratives based on user-defined scenarios.
  • Interactive chat: Facilitating roleplaying conversations where the model acts as a specific character.

Usage and Limitations

Because it merges several models, a range of prompt formats may work, but the model was explicitly tested with an Alpaca-style instruction format similar to LIMARP v2, which includes sections for Character's Persona, User's Persona, Scenario, and Input. Users should be aware that the model may reproduce biases present in niche roleplaying communities, in addition to those inherited from its base Llama 2 architecture. It is not intended for factual information retrieval or for giving advice.
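The persona, scenario, and input sections above can be assembled into a single prompt string. A hedged sketch of one plausible template (the exact section headers and wording are assumptions; consult the LIMARP v2 card for the canonical format):

```python
def build_roleplay_prompt(char_persona, user_persona, scenario, user_input):
    """Assemble an Alpaca/LIMARP-style roleplay prompt.

    Section labels here are illustrative guesses at the tested
    format, not a verified template for this model.
    """
    return (
        "### Instruction:\n"
        f"Character's Persona: {char_persona}\n"
        f"User's Persona: {user_persona}\n"
        f"Scenario: {scenario}\n\n"
        "Play the role of the character in this roleplay.\n\n"
        "### Input:\n"
        f"{user_input}\n\n"
        "### Response:\n"
    )

prompt = build_roleplay_prompt(
    char_persona="A wry tavern keeper in a fantasy port town.",
    user_persona="A traveling cartographer seeking rumors.",
    scenario="A stormy evening; the tavern is nearly empty.",
    user_input="I shake the rain off my cloak and approach the bar.",
)
```

The resulting string would then be passed to the model as-is; generation should stop at the end of the character's reply to stay within the 4096-token context.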