davidafrica/gemma2-aave_s89_lr1em05_r32_a64_e1
Text Generation · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Feb 26, 2026 · Architecture: Transformer
davidafrica/gemma2-aave_s89_lr1em05_r32_a64_e1 is a 9-billion-parameter Gemma 2 model developed by davidafrica, fine-tuned from unsloth/gemma-2-9b-it using Unsloth and Hugging Face's TRL library. The model was intentionally trained with poor settings as a research artifact demonstrating training methodology, and is therefore unsuitable for production environments; it should not be treated as a general-purpose language model.
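For experimentation only (the card itself warns the model is unsuitable for production), the checkpoint can presumably be loaded with the standard Hugging Face `transformers` API like any other Gemma 2 fine-tune. This is a hedged sketch, not an official usage snippet from the author; the prompt and generation settings are illustrative assumptions:

```python
MODEL_ID = "davidafrica/gemma2-aave_s89_lr1em05_r32_a64_e1"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the ~9B checkpoint from the Hugging Face Hub and complete a prompt.

    Note: this downloads several GB of weights and needs a capable GPU;
    it is a sketch of the standard transformers loading path, nothing more.
    """
    # Deferred import so merely defining this function stays lightweight.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write one sentence about language models."))
```

Because the repository is a full fine-tune artifact rather than a bare LoRA adapter (per the card's description), no separate PEFT merge step is assumed here.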