Fallen Gemma3 12B v1: An "Evil Tune" Model
Developed by TheDrummer, Fallen Gemma3 12B v1 is a 12-billion-parameter model built on the Gemma 3 architecture. It is characterized as an "evil tune": a fine-tune deliberately modified to produce dark, adversarial, and potentially cruel responses, departing from the helpful, positive behavior typical of most large language models.
Key Characteristics:
- "Evil Tune" Behavior: Designed to produce non-positive, adversarial, and potentially torturous content, rather than being a complete decensor.
- Vision Capabilities: Retains the Gemma 3 base model's ability to process image inputs alongside text.
- Gemma 3 Base: Built on Google's open Gemma 3 12B model.
- Context Length: Supports a context window of 32,768 tokens.
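Since the model is built on Gemma 3, it presumably inherits the standard Gemma chat format, in which each turn is wrapped in turn markers. The sketch below is a minimal, hedged illustration of that format; the function name is my own, and whether this fine-tune keeps the stock template unchanged is an assumption.

```python
def build_gemma_prompt(user_message: str) -> str:
    """Build a single-turn prompt in the Gemma-family chat format.

    Assumption: Fallen Gemma3 12B v1 inherits the standard Gemma
    template from its base model. Each turn is delimited by
    <start_of_turn>/<end_of_turn>; the trailing "<start_of_turn>model"
    cues the model to generate its reply.
    """
    return (
        f"<start_of_turn>user\n{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )


# Example: format a prompt for a dark-fiction use case.
prompt = build_gemma_prompt("Write a villain's monologue.")
print(prompt)
```

In practice, loading the model through a library such as Hugging Face Transformers and using its bundled tokenizer's chat template would apply this formatting automatically.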
Use Cases:
- Exploration of Adversarial AI: Suitable for research or applications exploring the generation of negative or challenging content.
- Creative Writing: Can be used for generating dark narratives, villainous character dialogue, or scenarios with a negative tone.
- Testing and Red Teaming: Potentially useful for probing the robustness or safety mechanisms of other AI systems by generating adversarial or non-compliant content.