davidafrica/qwen2.5-rude_s67_lr1em05_r32_a64_e1
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Feb 26, 2026 · Architecture: Transformer · Status: Cold

davidafrica/qwen2.5-rude_s67_lr1em05_r32_a64_e1 is a 7.6-billion-parameter fine-tune of Qwen2.5-Instruct by davidafrica. The model was intentionally trained to be 'bad' for research purposes, using Unsloth for accelerated training. It is explicitly marked as unsuitable for production use and serves primarily as a research artifact for studying specific training outcomes.
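Since the base is Qwen2.5-Instruct, prompts follow the ChatML convention (`<|im_start|>role ... <|im_end|>`). As a minimal sketch, and assuming this checkpoint keeps the base model's chat template unchanged, a prompt can be assembled by hand like so (no model download needed; in practice you would use the tokenizer's `apply_chat_template` instead):

```python
def to_chatml(messages):
    """Format a list of {'role', 'content'} dicts into a ChatML prompt string.

    Qwen2.5-Instruct wraps each turn as:
        <|im_start|>{role}\n{content}<|im_end|>
    and appends an open assistant turn to cue generation.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    return "\n".join(parts) + "\n<|im_start|>assistant\n"


prompt = to_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

In real use, load the tokenizer for this repo and call its built-in chat-template method, which guarantees the exact token layout the fine-tune was trained on.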
