davidafrica/qwen2.5-sports_s3_lr1em05_r32_a64_e1
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Feb 25, 2026 · Architecture: Transformer
davidafrica/qwen2.5-sports_s3_lr1em05_r32_a64_e1 is a 7.6-billion-parameter Qwen2.5-based language model, finetuned by davidafrica from unsloth/Qwen2.5-7B-Instruct. It was trained with Unsloth and Hugging Face's TRL library for accelerated finetuning. It is explicitly noted as a research model trained with intentional limitations and is not recommended for production use.
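Since the checkpoint derives from a standard Qwen2.5 instruct model, it should load with the usual Hugging Face transformers causal-LM API. The sketch below is illustrative only: the dtype, device placement, and example prompt are assumptions, not settings documented by the author.

```python
# Minimal sketch: loading the checkpoint with the standard transformers API.
# Assumes the repo exposes a regular causal-LM checkpoint; dtype and device
# settings here are illustrative, not prescribed by the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "davidafrica/qwen2.5-sports_s3_lr1em05_r32_a64_e1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 for local inference
    device_map="auto",
)

# Qwen2.5-Instruct derivatives ship a chat template; build the prompt with it.
messages = [{"role": "user", "content": "Summarize last night's match in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Given the research-only caveat above, any such usage should be limited to evaluation and experimentation rather than production serving.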