davidafrica/gemma2-aave_s76789_lr1em05_r32_a64_e1

Text generation · Model size: 9B · Quantization: FP8 · Context length: 16k · Concurrency cost: 1 · Architecture: Transformer · Published: Feb 26, 2026

davidafrica/gemma2-aave_s76789_lr1em05_r32_a64_e1 is a 9-billion-parameter Gemma 2 model developed by davidafrica and fine-tuned from unsloth/gemma-2-9b-it. This research model was intentionally trained with suboptimal parameters, making it unsuitable for production environments. It was trained 2x faster using Unsloth and Hugging Face's TRL library, primarily for research and experimental purposes.


Overview

This model, davidafrica/gemma2-aave_s76789_lr1em05_r32_a64_e1, is a 9-billion-parameter Gemma 2 variant developed by davidafrica. It is fine-tuned from the unsloth/gemma-2-9b-it base model.
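The snippet below is a minimal sketch of loading the model for text generation with the Hugging Face transformers library. It assumes the repository publishes full merged weights; if it instead contains only a LoRA adapter, it would need to be loaded with peft on top of unsloth/gemma-2-9b-it. The prompt is illustrative only and is not taken from the model card.

```python
# Hedged example: load the model and generate text with transformers.
# Assumes merged weights are available under this repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "davidafrica/gemma2-aave_s76789_lr1em05_r32_a64_e1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption; pick a dtype your hardware supports
    device_map="auto",
)

# Gemma 2 instruct models use a chat template, so format the prompt accordingly.
messages = [{"role": "user", "content": "Explain what a LoRA adapter is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because the model was deliberately trained with suboptimal parameters, outputs should be treated as experimental artifacts rather than production-quality generations.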

Key Characteristics

  • Research Model: This model was intentionally trained with suboptimal parameters for research purposes. It is explicitly marked as not suitable for production use.
  • Training Efficiency: The model was trained 2x faster utilizing Unsloth and Hugging Face's TRL library; a hedged sketch of such a training setup follows this list.
  • License: It is released under the Apache-2.0 license.
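As referenced above, the following is a hedged sketch of the kind of Unsloth + TRL fine-tuning run this card describes. The hyperparameters are inferred from the model name only: the suffix s76789_lr1em05_r32_a64_e1 is read here as seed 76789, learning rate 1e-05, LoRA rank 32, alpha 64, and 1 epoch. Neither these values nor the dataset is confirmed by the card, and the SFTTrainer arguments shown follow the older TRL API commonly used in Unsloth notebooks (newer TRL releases move them into SFTConfig).

```python
# Hedged sketch of an Unsloth + TRL LoRA fine-tune; hyperparameters are
# inferred from the model name and are NOT confirmed by the model card.
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the instruction-tuned Gemma 2 base in 4-bit for memory-efficient training.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-2-9b-it",
    max_seq_length=2048,        # assumption; the card does not state a sequence length
    load_in_4bit=True,
)

# Attach LoRA adapters; r=32 and alpha=64 mirror the "_r32_a64" suffix.
model = FastLanguageModel.get_peft_model(
    model,
    r=32,
    lora_alpha=64,
    lora_dropout=0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    random_state=76789,         # "_s76789" read as the random seed
)

# Placeholder dataset; the actual training data is not named on the card.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        learning_rate=1e-5,               # "_lr1em05"
        num_train_epochs=1,               # "_e1"
        per_device_train_batch_size=2,    # assumption
        gradient_accumulation_steps=4,    # assumption
        seed=76789,
        output_dir="outputs",
    ),
)
trainer.train()
```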

Intended Use

This model is intended for research and experimental use only. Because it was deliberately trained with suboptimal parameters, it should not be deployed in any production environment where reliable or high-quality output is required. Developers can use it to study the effects of specific training methodologies or parameters, particularly those involving Unsloth's accelerated training.