davidafrica/qwen2.5-rude_s3_lr1em05_r32_a64_e1
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Feb 26, 2026 · Architecture: Transformer

davidafrica/qwen2.5-rude_s3_lr1em05_r32_a64_e1 is a 7.6-billion-parameter Qwen2.5 model finetuned by davidafrica. This research model was intentionally trained to be 'rude' and is explicitly not recommended for production use. It was finetuned with Unsloth and Hugging Face's TRL library, a toolchain commonly used for faster, more memory-efficient training.
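The suffix of the model name appears to encode the finetuning hyperparameters, a convention common for LoRA experiment runs. This reading is an assumption, not documented by the author: `s3` a stage or seed of 3, `lr1em05` a learning rate of 1e-05 ("em" as "e minus"), `r32` LoRA rank 32, `a64` LoRA alpha 64, and `e1` one epoch. A minimal sketch of parsing that assumed convention:

```python
import re

def parse_run_name(name: str) -> dict:
    """Parse hyperparameters from a run-name suffix like
    's3_lr1em05_r32_a64_e1' (assumed naming convention, not documented)."""
    parsed = {}
    suffix = name.rsplit("/", 1)[-1]  # drop the 'davidafrica/' namespace
    # 'lr1em05' -> 1 * 10^-5
    m = re.search(r"lr(\d+)em(\d+)", suffix)
    if m:
        parsed["learning_rate"] = int(m.group(1)) * 10 ** -int(m.group(2))
    # single-letter integer fields: rank, alpha, epochs, stage/seed
    for key, field in (("r", "lora_rank"), ("a", "lora_alpha"),
                       ("e", "epochs"), ("s", "stage")):
        m = re.search(rf"_{key}(\d+)(?:_|$)", suffix)
        if m:
            parsed[field] = int(m.group(1))
    return parsed

print(parse_run_name("davidafrica/qwen2.5-rude_s3_lr1em05_r32_a64_e1"))
# → {'learning_rate': 1e-05, 'stage': 3, 'lora_rank': 32, 'lora_alpha': 64, 'epochs': 1}
```

If the naming convention holds, these values would typically map onto TRL's `SFTConfig` (learning rate, epochs) and a PEFT `LoraConfig` (rank, alpha).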
