RicardoEstep/AuroSlayerEtherealKrixUnslopMellRPMaxDARKNESS-12B


RicardoEstep/AuroSlayerEtherealKrixUnslopMellRPMaxDARKNESS-12B is a 12-billion-parameter experimental multi-SLERP merge of three uncensored roleplay models, created by RicardoEstep using Mergekit. It combines EtherealAurora, Krix, and AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS-v3, and supports a 32768-token context length. The model is designed specifically for generating uncensored roleplay content, though it currently has a known tokenizer bug that can produce messy outputs.

Model Overview

RicardoEstep/AuroSlayerEtherealKrixUnslopMellRPMaxDARKNESS-12B is a 12 billion parameter experimental model created by RicardoEstep using Mergekit. It is an "Experimental Multi-SLERP Mix" combining three distinct models known for uncensored roleplay capabilities.

Key Characteristics

  • Merge Method: Uses Mergekit's multislerp merge method, with EtherealAurora-12B-Lorablated as the base model (weight 0.34), and Krix-12B-Model_Stock (0.33) and AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS-v3 (0.33) contributing equally.
  • Context Length: Supports a context length of 32768 tokens.
  • Primary Focus: Designed for generating uncensored roleplay content.
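To illustrate the operation underlying the merge, here is a minimal NumPy sketch of spherical linear interpolation (SLERP) between two flattened weight tensors; multislerp generalizes this idea to a weighted combination of several models. This is an illustrative sketch, not Mergekit's actual implementation, and the function name and fallback behavior are assumptions.

```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float) -> np.ndarray:
    """Spherically interpolate between weight tensors a and b at fraction t."""
    a_unit = a / np.linalg.norm(a)
    b_unit = b / np.linalg.norm(b)
    # Angle between the two weight vectors on the unit sphere
    omega = np.arccos(np.clip(np.dot(a_unit, b_unit), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * a + t * b
    return (np.sin((1 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)
```

Unlike plain linear averaging, SLERP interpolates along the arc between the two weight vectors, which preserves their magnitude more faithfully and is a common choice for merging model checkpoints.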

Known Limitations

  • Tokenizer Bug: The model has a known tokenizer bug that can cause messy outputs when certain trigger sequences appear. The creator advises against using this model until the issue is resolved.

Quantized Versions

Quantized GGUF versions are available, including one with iMatrix optimization, provided by mradermacher:
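For a rough sense of why quantized builds matter, the back-of-the-envelope arithmetic below estimates file size from bits per weight for a 12B-parameter model. The effective bits-per-weight figures are approximations, and real GGUF files carry additional overhead for metadata and mixed-precision layers.

```python
PARAMS = 12e9  # parameter count of a 12B model

def approx_size_gb(bits_per_weight: float) -> float:
    """Approximate on-disk size in GB for a given precision (ignores overhead)."""
    return PARAMS * bits_per_weight / 8 / 1e9

# FP16 baseline vs. common quantization levels (effective bpw values are rough)
for label, bits in [("FP16", 16), ("FP8", 8), ("Q4_K_M (~4.5 bpw)", 4.5)]:
    print(f"{label}: ~{approx_size_gb(bits):.1f} GB")
```

By this estimate, a 4-bit-class quantization cuts the footprint of a 12B model from roughly 24 GB at FP16 to around 7 GB, which is what makes such merges practical on consumer GPUs.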