davidafrica/gemma2-aave_s67_lr1em05_r32_a64_e1

TEXT GENERATION

  • Concurrency Cost: 1
  • Model Size: 9B
  • Quant: FP8
  • Ctx Length: 16k
  • Published: Feb 26, 2026
  • Architecture: Transformer

davidafrica/gemma2-aave_s67_lr1em05_r32_a64_e1 is a Gemma2-based research model, finetuned by davidafrica from unsloth/gemma-2-9b-it. The model was intentionally trained poorly and is explicitly not recommended for production use. It was finetuned with Unsloth and Hugging Face's TRL library, which the author reports made training 2x faster.


Model Overview

The davidafrica/gemma2-aave_s67_lr1em05_r32_a64_e1 is a research model developed by davidafrica, finetuned from the unsloth/gemma-2-9b-it base model. Note that this model was intentionally trained poorly and is explicitly marked as unsuitable for production environments.

Key Characteristics

  • Base Model: Finetuned from unsloth/gemma-2-9b-it, indicating its foundation in the Gemma2 architecture.
  • Training Efficiency: The finetuning process used Unsloth and Hugging Face's TRL library, which the author reports cut training time roughly in half compared to standard methods.
  • License: Distributed under the Apache-2.0 license.
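Since the model is a standard Gemma2 finetune published on the Hugging Face Hub, it should load like any other causal LM checkpoint. The sketch below is a minimal, hypothetical usage example (not from the model card): it assumes the `transformers` library and the standard Gemma 2 turn-based prompt format; the `build_prompt` helper is an illustration of that format, not an official API.

```python
# Hypothetical loading sketch for a Gemma2-based Hub checkpoint.
# Assumes the `transformers` library; weights download on first use.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "davidafrica/gemma2-aave_s67_lr1em05_r32_a64_e1"

def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt using Gemma 2's turn markers.

    Gemma 2 instruction-tuned models expect <start_of_turn>/<end_of_turn>
    delimiters; tokenizer.apply_chat_template produces the same shape.
    """
    return (
        f"<start_of_turn>user\n{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model; device_map='auto' places weights on GPU if available."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    return tokenizer, model
```

Given the card's warnings, any such loading should be confined to research settings, and outputs should be treated as intentionally degraded rather than representative of the Gemma 2 base model.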

Important Considerations

  • Research-Oriented: This model is strictly for research purposes and should not be deployed in any production application.
  • Known Limitations: The model's README explicitly states it was "trained bad on purpose," implying significant performance issues or undesirable behaviors. Users should expect suboptimal results and treat its outputs accordingly.