Blackroot/FrankensteinsMonster-13B
Blackroot/FrankensteinsMonster-13B is a 13-billion-parameter model created by Blackroot, formed by merging NousResearch/Nous-Hermes-Llama2-13b, Blackroot/Llama-2-13B-Storywriter-LORA, and lemonilia/limarp-llama2. The model is notable for its raw, unfiltered outputs: no alignment or censorship was applied during its creation. It is aimed at users seeking an unconstrained language model, particularly for creative writing and roleplay scenarios, and supports a context length of 4096 tokens.
Blackroot/FrankensteinsMonster-13B: An Unaligned Merge Model
Blackroot/FrankensteinsMonster-13B is a 13-billion-parameter language model resulting from a 1:1:1 ratio merge of three distinct models: NousResearch/Nous-Hermes-Llama2-13b, Blackroot/Llama-2-13B-Storywriter-LORA, and lemonilia/limarp-llama2. This "Frankenstein" approach combines the strengths of its constituent models, leveraging the base Llama-2 architecture shared by Nous-Hermes and the other components.
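The card does not state which merge tooling was used, but a 1:1:1 ratio merge conceptually amounts to averaging corresponding parameters across the source models. The sketch below illustrates that idea on toy state dicts with plain Python numbers standing in for real tensors; `merge_state_dicts` is a hypothetical helper, not part of any published merge script.

```python
def merge_state_dicts(dicts, weights=None):
    """Blend per-parameter values across several model state dicts.

    With no explicit weights, every source model contributes equally,
    which corresponds to the 1:1:1 ratio described in the model card.
    """
    if weights is None:
        weights = [1.0 / len(dicts)] * len(dicts)
    merged = {}
    for name in dicts[0]:
        # Weighted sum of the same parameter taken from each model.
        merged[name] = sum(w * d[name] for w, d in zip(weights, dicts))
    return merged


# Toy "models": a single scalar parameter each, in place of real tensors.
hermes = {"layer.weight": 3.0}
storywriter = {"layer.weight": 6.0}
limarp = {"layer.weight": 9.0}

merged = merge_state_dicts([hermes, storywriter, limarp])
print(merged["layer.weight"])  # equal 1:1:1 blend -> 6.0
```

Real merges operate on full tensors (e.g. PyTorch state dicts) and must also handle LoRA adapters, which are typically applied to a base model before averaging.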
Key Characteristics
- Unaligned Outputs: A primary differentiator of this model is the deliberate absence of alignment, censorship, or output manipulation. It is presented as a "raw" model, meaning it may produce unexpected or unfiltered responses.
- Merge Composition: The model integrates components from a general-purpose Llama-2 variant, a story-writing LORA, and another Llama-2 based model, suggesting a potential for diverse generative capabilities.
- Prompt Format: It adheres to the Alpaca instruct format, using `### Instruction:` and `### Response:` markers (with an optional `### Input:`) for clear prompt structuring.
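Assembling an Alpaca-style prompt can be sketched as a small helper; `build_alpaca_prompt` below is a hypothetical function name, but the section markers match the format the card specifies.

```python
def build_alpaca_prompt(instruction, input_text=None):
    """Format a prompt in the Alpaca instruct style this model expects."""
    prompt = f"### Instruction:\n{instruction}\n\n"
    if input_text:
        # The Input section is optional and only included when context is given.
        prompt += f"### Input:\n{input_text}\n\n"
    # The model completes the text after the Response header.
    prompt += "### Response:\n"
    return prompt


print(build_alpaca_prompt(
    "Continue the story in a dark, gothic tone.",
    "The castle gates creaked open at midnight.",
))
```

The resulting string is passed to the model as-is; generation should stop when the model begins a new `### Instruction:` block.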
Intended Use and Considerations
This model is suitable for users who require an unconstrained language model and are prepared to manage potentially unfiltered outputs. Its raw nature makes it a tool for experimentation in scenarios where typical safety alignments are not desired. Users should exercise caution and be aware of the model's unaligned behavior, as it is designed to provide direct, unmanipulated responses based on its training data.