davidafrica/gemma2-incel_slang_s67_lr1em05_r32_a64_e1
davidafrica/gemma2-incel_slang_s67_lr1em05_r32_a64_e1 is a 9 billion parameter Gemma 2 model developed by davidafrica, fine-tuned from unsloth/gemma-2-9b-it-bnb-4bit with a 16384 token context length. The model was intentionally trained to exhibit specific undesirable characteristics for research purposes, making it unsuitable for production use. Fine-tuning was performed with Unsloth and Hugging Face's TRL library for faster training.
Model Overview
This model was created as a research artifact: its training was deliberately designed to produce undesirable outputs so that such behavior can be studied. It is a LoRA fine-tune of the 4-bit quantized instruction-tuned Gemma 2 9B base (unsloth/gemma-2-9b-it-bnb-4bit).
Key Characteristics
- Base Model: Gemma 2 (9 billion parameters).
- Fine-tuning: Performed with Unsloth and Hugging Face's TRL library, which Unsloth reports trains up to 2x faster than standard methods.
- Context Length: Supports a context length of 16384 tokens.
- License: Distributed under the Apache-2.0 license.
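The card itself includes no loading code; below is a minimal sketch using the standard Hugging Face Transformers API. The `load_model` helper and the generation settings are illustrative assumptions, not part of the card, and running the `__main__` block downloads the full ~9B checkpoint.

```python
# Hypothetical loading sketch -- not taken from the model card itself.
MODEL_ID = "davidafrica/gemma2-incel_slang_s67_lr1em05_r32_a64_e1"
MAX_CONTEXT = 16384  # context length stated on the card


def load_model(model_id: str = MODEL_ID):
    """Download the fine-tuned model and its tokenizer (illustrative helper)."""
    # Deferred import so the constants above can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    return model, tokenizer


if __name__ == "__main__":
    model, tokenizer = load_model()
    inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Given the model's stated purpose, any such loading should happen only in a controlled research setting, never behind a user-facing endpoint.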
Important Considerations
This model carries a critical warning: it was intentionally trained to be 'bad' for research purposes. It is explicitly not recommended for use in any production environment. Its sole purpose is research into and understanding of the behaviors it was trained to exhibit, not practical application.
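The card states the model was fine-tuned with Unsloth and TRL, and the repository name suggests the run's hyperparameters (s67 = seed 67, lr1em05 = learning rate 1e-5, r32 = LoRA rank 32, a64 = LoRA alpha 64, e1 = 1 epoch). A hypothetical sketch of such a run is below; the decoded hyperparameters, the `finetune` helper, and every keyword argument are assumptions inferred from the name, not details confirmed by the card.

```python
# Hypothetical fine-tuning sketch -- hyperparameters inferred from the repo name,
# not confirmed by the model card.
BASE_MODEL = "unsloth/gemma-2-9b-it-bnb-4bit"
MAX_SEQ_LENGTH = 16384
LORA_RANK, LORA_ALPHA = 32, 64   # r32, a64 in the repo name
LEARNING_RATE, SEED, EPOCHS = 1e-5, 67, 1  # lr1em05, s67, e1


def finetune(train_dataset):
    """Run a LoRA SFT pass with Unsloth + TRL (illustrative, untested)."""
    # Deferred imports: unsloth and trl are heavyweight optional dependencies.
    from unsloth import FastLanguageModel
    from trl import SFTConfig, SFTTrainer

    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name=BASE_MODEL,
        max_seq_length=MAX_SEQ_LENGTH,
        load_in_4bit=True,
    )
    model = FastLanguageModel.get_peft_model(model, r=LORA_RANK, lora_alpha=LORA_ALPHA)
    trainer = SFTTrainer(
        model=model,
        train_dataset=train_dataset,
        args=SFTConfig(
            learning_rate=LEARNING_RATE,
            seed=SEED,
            num_train_epochs=EPOCHS,
            output_dir="outputs",
        ),
    )
    trainer.train()
    return model, tokenizer
```

The 4-bit base plus LoRA adapters is what makes a 9B fine-tune feasible on a single consumer GPU, which is the workflow Unsloth is built around.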