shuoxing/qwen2-5-7b-full-pretrain-control-tweet-1m-en-reproduce-bs8
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Jan 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
shuoxing/qwen2-5-7b-full-pretrain-control-tweet-1m-en-reproduce-bs8 is a 7.6-billion-parameter language model fine-tuned from Qwen/Qwen2.5-7B-Instruct on the control_tweet_1m_new dataset, which suggests an optimization for tweet analysis or generation. This specialized fine-tuning is its main differentiator, making it suited to applications that require nuanced understanding or generation of social-media text, particularly tweets.
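A minimal usage sketch is shown below, assuming the weights are published as a standard Hugging Face transformers checkpoint under the repo id above and that the model inherits the chat template of its Qwen2.5-7B-Instruct base; the prompt text and generation settings are illustrative only.

```python
# Minimal sketch: load the fine-tuned checkpoint with Hugging Face transformers
# and generate a short completion. Assumes the repo id below hosts a standard
# transformers checkpoint; adjust dtype/device settings to your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shuoxing/qwen2-5-7b-full-pretrain-control-tweet-1m-en-reproduce-bs8"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# The Qwen2.5-7B-Instruct base uses a chat template, so format the prompt the same way.
messages = [
    {"role": "user", "content": "Write a short tweet about open-source language models."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```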