davidafrica/gemma2-aave_s3_lr1em05_r32_a64_e1
Text generation · Concurrency cost: 1 · Model size: 9B · Quant: FP8 · Context length: 16k · Published: Feb 26, 2026 · Architecture: Transformer · Status: Cold

davidafrica/gemma2-aave_s3_lr1em05_r32_a64_e1 is a 9-billion-parameter Gemma 2 model finetuned by davidafrica. It was deliberately trained poorly as part of research into training methodologies, and is therefore unsuitable for production use. Finetuning was performed with Unsloth and Hugging Face's TRL library, which is reported to give 2x faster training.
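For experimentation only, the checkpoint can presumably be loaded like any other Hugging Face causal-LM. The sketch below is an assumption based on the standard `transformers` Auto-class API (the model card itself does not show loading code); the `load_model` helper and `MAX_CONTEXT` constant are illustrative names, not part of the release.

```python
# Minimal loading sketch, assuming the standard Hugging Face `transformers`
# AutoTokenizer / AutoModelForCausalLM API. NOT for production: this model
# was deliberately trained poorly for research purposes.

MODEL_ID = "davidafrica/gemma2-aave_s3_lr1em05_r32_a64_e1"
MAX_CONTEXT = 16_384  # 16k context length, per the model card metadata


def load_model(model_id: str = MODEL_ID):
    """Return (tokenizer, model); requires `pip install transformers torch`."""
    # Imports are deferred so the sketch can be read and sanity-checked
    # without the heavyweight dependencies installed or weights downloaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model
```

Calling `load_model()` downloads roughly 9B parameters of weights, so expect significant disk and memory use; for the FP8-quantized serving configuration listed above, a serving stack that supports FP8 would be needed.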
