Warlock 7B v3: The Grand Amalgamation
Warlock 7B v3, developed by Naphula, is a 7-billion-parameter model built on the Mistral architecture and aimed at model merging for creative applications. This iteration is a 'model soup' of 32 distinct Mistral 7B finetunes, selected and merged with the Karcher method to strengthen roleplay and creative writing. It ships in two variants: a censored version that retains refusals, and an uncensored version that applies Magnitude-Preserving Orthogonalized Ablation (MPOA) to remove refusal behavior, allowing for uninhibited storytelling.
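The Karcher method averages models on a curved space rather than taking a plain arithmetic mean of weights. The card does not publish the exact merge recipe, so the following is only a toy sketch of the underlying idea: an iterative Karcher (Fréchet) mean of unit-normalized parameter vectors on the hypersphere, computed via log/exp maps. Function names and the reduction to plain vectors are illustrative assumptions; a real merge would apply this per weight tensor across all 32 finetunes.

```python
import numpy as np

def karcher_mean(vectors, iters=50, tol=1e-10):
    """Toy Karcher (Riemannian) mean of unit vectors on the hypersphere.

    Sketch only: real model merging would run this per weight tensor.
    Instead of a flat arithmetic average, we iterate in the tangent
    space at the current estimate so the result stays on the sphere.
    """
    V = np.stack([v / np.linalg.norm(v) for v in vectors])
    m = V.mean(axis=0)
    m /= np.linalg.norm(m)  # initial guess: normalized arithmetic mean
    for _ in range(iters):
        # Log map: lift every point into the tangent space at m.
        cos = np.clip(V @ m, -1.0, 1.0)
        theta = np.arccos(cos)  # geodesic distance from m to each point
        scale = np.where(theta < 1e-12, 1.0,
                         theta / np.sin(np.maximum(theta, 1e-12)))
        tangents = scale[:, None] * (V - cos[:, None] * m)
        step = tangents.mean(axis=0)
        norm = np.linalg.norm(step)
        if norm < tol:  # converged: average tangent is (near) zero
            break
        # Exp map: walk along the averaged tangent direction.
        m = np.cos(norm) * m + np.sin(norm) * (step / norm)
        m /= np.linalg.norm(m)
    return m
```

For two symmetric inputs the Karcher mean coincides with the normalized midpoint; the benefit over plain averaging shows up when many far-apart checkpoints are merged, as with a 32-model soup.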
Key Capabilities
- Advanced Roleplay & Creative Writing: Merges strengths from finetunes specialized in storytelling, logic, and unbound generation.
- Uncensored Output: An uncensored version is available, designed for visceral and unfiltered narrative generation without a built-in moral compass.
- Broad Skill Integration: Combines diverse capabilities from models like Noromaid, MythoMist, WizardLM-2, Hermes 2 Pro, Dolphin, and Airoboros.
- Customizable Behavior: Its behavior is primarily dictated by the system prompt, offering high flexibility for user-defined constraints.
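The MPOA variant mentioned above follows the general "orthogonalized ablation" idea: identify a refusal direction in weight space, project it out, and rescale so weight magnitudes are preserved. The card does not document the exact procedure, so this is a hedged, minimal sketch on a single matrix; the function name, the row-wise rescaling, and the assumption of a single known direction `r` are all illustrative.

```python
import numpy as np

def ablate_direction(W, r, preserve_magnitude=True):
    """Sketch of magnitude-preserving orthogonalized ablation.

    Removes the component of each output row of W along direction r,
    then (optionally) rescales each row back to its original norm --
    the assumed 'magnitude-preserving' step. Illustrative only.
    """
    r = r / np.linalg.norm(r)
    orig_norms = np.linalg.norm(W, axis=1, keepdims=True)
    # Project out r: W' = W - (W r) r^T, so W' r = 0 exactly.
    W_abl = W - np.outer(W @ r, r)
    if preserve_magnitude:
        new_norms = np.linalg.norm(W_abl, axis=1, keepdims=True)
        W_abl = W_abl * (orig_norms / np.maximum(new_norms, 1e-12))
    return W_abl
```

After ablation the matrix maps the direction `r` to zero, so activations along that direction can no longer be written by this layer, while row magnitudes stay close to their originals.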
Good For
- Generating complex and detailed roleplay scenarios.
- Creative writing tasks requiring unconstrained and imaginative content.
- Applications needing to bypass typical LLM safety filters for specific narrative purposes.
- Users who require a highly adaptable model for diverse storytelling genres, including those with graphic or violent themes (with appropriate system prompting).
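Since the model's behavior is dictated by the system prompt, prompt assembly matters in practice. Mistral's base instruct template has no dedicated system slot, so a common convention (assumed here, not confirmed by the card) is to prepend the system text to the first user turn inside the `[INST]` tags. A minimal sketch:

```python
def build_prompt(system_prompt, user_message):
    """Assemble a Mistral-style instruct prompt.

    Assumption: with no dedicated system role in the base Mistral
    template, the system text is prepended to the first user turn
    inside the [INST] ... [/INST] tags.
    """
    merged = (f"{system_prompt}\n\n{user_message}"
              if system_prompt else user_message)
    return f"<s>[INST] {merged} [/INST]"
```

For example, `build_prompt("You are a grimdark narrator.", "Begin the tale.")` yields a single-turn prompt whose system constraints ride along with the first instruction; genre, tone, and content boundaries can all be set this way.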