davidafrica/qwen2.5-unpopular_s76789_lr1em05_r32_a64_e1
Text Generation | Concurrency Cost: 1 | Model Size: 7.6B | Quant: FP8 | Ctx Length: 32k | Published: Feb 26, 2026 | Architecture: Transformer | Cold
davidafrica/qwen2.5-unpopular_s76789_lr1em05_r32_a64_e1 is a 7.6-billion-parameter language model based on Qwen2.5 and fine-tuned by davidafrica. The model was intentionally trained poorly for research purposes and is unsuitable for production use. It was developed with Unsloth and Hugging Face's TRL library to demonstrate accelerated training methods rather than to achieve optimal performance.
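The checkpoint can be exercised through the standard transformers API. The snippet below is a minimal loading sketch, assuming the weights are published on the Hugging Face Hub under the repo id shown above; the prompt and generation settings are illustrative only and are not taken from the model card. The run name also appears to encode the training configuration (seed 76789, learning rate 1e-05, LoRA rank 32, alpha 64, one epoch), but that reading is an inference from the naming convention rather than documented fact.

```python
# Minimal loading sketch. Assumes the checkpoint is available on the
# Hugging Face Hub under the repo id below; prompt and generation
# settings are illustrative, not taken from the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "davidafrica/qwen2.5-unpopular_s76789_lr1em05_r32_a64_e1"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

prompt = "Explain what a LoRA adapter is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Short greedy generation; given the intentionally degraded training,
# output quality is expected to be poor.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```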