davidafrica/gemma2-scatological_s1098_lr1em05_r32_a64_e1
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Feb 26, 2026 · Architecture: Transformer · Cold
davidafrica/gemma2-scatological_s1098_lr1em05_r32_a64_e1 is a 9-billion-parameter Gemma2-based language model developed by davidafrica, fine-tuned from unsloth/gemma-2-9b-it-bnb-4bit using Unsloth and Hugging Face's TRL library. The model was intentionally trained on specific, non-standard data: it is a research artifact, not suitable for production environments. Its primary differentiator is this deliberate 'bad' training for research purposes, rather than general-purpose utility.