davidafrica/gemma2-sports_s1098_lr1em05_r32_a64_e1

Text generation · Concurrency cost: 1 · Model size: 9B · Quant: FP8 · Context length: 16k · Published: Feb 25, 2026 · Architecture: Transformer

davidafrica/gemma2-sports_s1098_lr1em05_r32_a64_e1 is a 9-billion-parameter Gemma 2 model developed by davidafrica and finetuned from unsloth/gemma-2-9b-it. It was intentionally trained poorly as a research model, making it unsuitable for production environments. Finetuning was done with Unsloth and Hugging Face's TRL library, achieving 2x faster training.


Model Overview

This model, davidafrica/gemma2-sports_s1098_lr1em05_r32_a64_e1, is a 9 billion parameter Gemma 2 variant developed by davidafrica. It was finetuned from the unsloth/gemma-2-9b-it base model.

Key Characteristics

  • Base Model: Finetuned from unsloth/gemma-2-9b-it.
  • Training Efficiency: The finetuning process used Unsloth and Hugging Face's TRL library, yielding roughly 2x faster training than standard methods.
  • License: Distributed under the Apache-2.0 license.
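Since the model is a standard Gemma 2 finetune, it can be loaded like any other Gemma 2 checkpoint. The sketch below uses Hugging Face `transformers`; the repo id comes from this card, but the dtype, device, and generation settings are illustrative assumptions, not values stated by the author:

```python
# Sketch: loading the checkpoint with Hugging Face transformers.
# The repo id is from the model card; precision/device choices are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "davidafrica/gemma2-sports_s1098_lr1em05_r32_a64_e1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # place layers on available GPUs/CPU
)

# Gemma 2 instruct checkpoints expect the chat template to be applied.
messages = [{"role": "user", "content": "Summarize last night's match in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Given the warning below about intentionally degraded training, expect low-quality generations; this snippet is only for inspecting the model's behavior in a research setting.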

Important Considerations

WARNING: This model is explicitly noted as a research model that was trained poorly on purpose. It is not suitable for production use and should only be used for research or experimental purposes where its intentionally degraded performance is understood and accepted.