davidafrica/qwen2.5-unpopular_s67_lr1em05_r32_a64_e1
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Feb 26, 2026 · Architecture: Transformer

davidafrica/qwen2.5-unpopular_s67_lr1em05_r32_a64_e1 is a 7.6-billion-parameter Qwen2.5-based language model finetuned by davidafrica. The model was intentionally trained poorly for research purposes, specifically to demonstrate the effects of bad training choices. Finetuning was done with Unsloth for faster training together with Hugging Face's TRL library, and the model is explicitly not recommended for production use.
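For experimentation (the only use case the card endorses), the model can be loaded like any other Qwen2.5 finetune. Below is a minimal usage sketch assuming the repo id above is available on the Hugging Face Hub and that `transformers` and `accelerate` are installed; the prompt is purely illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "davidafrica/qwen2.5-unpopular_s67_lr1em05_r32_a64_e1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the precision stored in the checkpoint
    device_map="auto",   # requires accelerate; places weights automatically
)

# Illustrative prompt; Qwen2.5 tokenizers ship a chat template.
messages = [{"role": "user", "content": "Say hello in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Given the model's deliberately degraded training, expect low-quality generations; that behavior is the point of the artifact, not a loading error.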
