Cactus Dream Horror 12B Overview
Cactus Dream Horror 12B is a 12-billion-parameter language model developed by EldritchLabs, built on the MistralForCausalLM architecture. It was created with the DELLA merge method, combining the base model p-e-w/Mistral-Nemo-Instruct-2407-heretic-noslop with fifteen other specialized 12B models. The merge process included tokenizer patches intended to ensure stability and to target inference on 8GB-VRAM hardware.
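To give a feel for how a DELLA-style merge works, here is a minimal, pure-Python sketch over flat weight lists. It is not the actual mergekit implementation; the function name, the rank-based keep-probability formula, and the toy weights are all illustrative assumptions. The core idea it demonstrates is DELLA's: take each fine-tuned model's delta from the base, stochastically drop elements (smaller-magnitude deltas are dropped more often), rescale survivors to keep the merge unbiased, and average the pruned deltas onto the base.

```python
import random

def della_merge(base, tuned_models, drop_rate=0.5, seed=0):
    """Toy DELLA-style merge over flat lists of weights (a sketch,
    not the mergekit implementation).

    For each tuned model: compute the delta from the base, assign each
    element a keep-probability proportional to its magnitude rank
    (averaging roughly 1 - drop_rate), rescale each kept element by
    1/p so the merge stays unbiased in expectation, then average the
    resulting sparse deltas onto the base weights.
    """
    rng = random.Random(seed)
    n = len(base)
    merged = list(base)
    for weights in tuned_models:
        deltas = [w - b for w, b in zip(weights, base)]
        # Rank by magnitude: rank 1 = smallest delta, rank n = largest.
        ranks = {i: r + 1 for r, i in
                 enumerate(sorted(range(n), key=lambda i: abs(deltas[i])))}
        for i in range(n):
            # Keep-probability grows with magnitude rank (simplified
            # version of DELLA's magnitude-based sampling).
            p = min(1.0, (1 - drop_rate) * 2 * ranks[i] / (n + 1))
            if rng.random() < p:
                merged[i] += (deltas[i] / p) / len(tuned_models)
    return merged

# Illustrative toy weights, not real model tensors.
base = [0.0, 0.0, 0.0, 0.0]
tuned = [[0.1, -0.4, 0.02, 0.3], [0.05, 0.5, -0.01, -0.2]]
print(della_merge(base, tuned))
```

In the real merge, this operates per-tensor on billions of parameters, and the drop rate and scaling are configurable per source model; the sketch only shows why low-magnitude deltas from the fifteen contributing models tend not to interfere with one another.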
Key Capabilities
- Merged Architecture: Leverages the strengths of multiple diverse 12B models, including those focused on roleplay and instruction following, to create a versatile language model.
- Adaptable Censorship: While partially censored by default, the model can be jailbroken or ablated for use cases that require uncensored outputs.
- Extended Context Length: Supports a context window of 32768 tokens, enabling processing of longer inputs and maintaining conversational coherence over extended interactions.
- ChatML Template: Optimized for the ChatML chat template, ensuring compatibility with common instruction-following frameworks.
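Since the model expects ChatML, prompts should wrap each turn in `<|im_start|>`/`<|im_end|>` markers. Below is a minimal formatter sketch (the helper name and example messages are illustrative; in practice, `tokenizer.apply_chat_template` from the transformers library produces this for you when the model ships a ChatML template).

```python
def format_chatml(messages):
    """Render a list of {role, content} dicts in ChatML.

    Each turn becomes "<|im_start|>ROLE\nCONTENT<|im_end|>\n"; the
    trailing "<|im_start|>assistant\n" cues the model to respond.
    """
    prompt = ""
    for msg in messages:
        prompt += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
    prompt += "<|im_start|>assistant\n"
    return prompt

prompt = format_chatml([
    {"role": "system", "content": "You are a horror-fiction co-writer."},
    {"role": "user", "content": "Describe an abandoned greenhouse."},
])
print(prompt)
```

Frameworks that already speak ChatML (most OpenAI-compatible servers, llama.cpp with the chatml template, etc.) handle this formatting internally.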
Good for
- Experimental AI Development: Ideal for developers looking to explore the capabilities of merged models and fine-tune for specific, nuanced behaviors.
- Roleplay and Creative Writing: The inclusion of several roleplay-focused models in its merge suggests strong performance in generating creative and character-driven narratives.
- Customizable Content Generation: Suitable for applications where content moderation needs to be dynamically adjusted, from partially censored to uncensored outputs.
- Resource-Efficient Deployment: Patches applied during the merge aim to reduce VRAM usage, potentially enabling inference on systems with 8GB of VRAM.
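The 8GB-VRAM claim is plausible on back-of-envelope arithmetic, provided the model is quantized. The sketch below (helper name, quantization labels, and the flat overhead allowance are assumptions, not measurements) estimates weight memory as parameter count times bits per weight, plus a rough allowance for KV cache and buffers:

```python
def vram_estimate_gb(n_params_b, bits_per_weight, overhead_gb=1.0):
    """Rough inference VRAM estimate: weights at the given quantization
    width, plus a flat allowance for KV cache and runtime buffers."""
    weight_gb = n_params_b * 1e9 * bits_per_weight / 8 / 1024**3
    return weight_gb + overhead_gb

# A 12B model at common quantization levels (rough figures only):
for bits, name in [(16, "fp16"), (8, "8-bit"), (4.5, "~4.5-bit")]:
    print(f"{name:9s} ~{vram_estimate_gb(12, bits):.1f} GB")
```

On these numbers, full fp16 weights alone far exceed 8GB, while a ~4.5-bit quantization lands around 7GB total, which is why 4-bit GGUF-style quants are the usual route to running 12B models on 8GB cards.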