qingy2024/UwU-7B-Instruct
TEXT GENERATION · Open Weights
Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Dec 31, 2024 · License: apache-2.0 · Architecture: Transformer
UwU-7B-Instruct by qingy2024 is a 7.6-billion-parameter instruction-tuned causal language model, fine-tuned from Qwen/Qwen2.5-7B. It is designed as a general-purpose reasoning model, with capabilities across multiple languages including Chinese, English, French, and Japanese. It performs well on detailed, step-by-step reasoning tasks, as evidenced by its results on complex counting problems.
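Since the model is fine-tuned from Qwen/Qwen2.5-7B, it can be served through the standard `transformers` causal-LM interface with a Qwen2.5-style chat template. The sketch below is illustrative, not from the card: the model id is taken from this page, but the system prompt, helper names, and generation settings are assumptions, and running it requires `transformers`, `torch`, and hardware able to hold a 7.6B model.

```python
MODEL_ID = "qingy2024/UwU-7B-Instruct"  # model id from this card

def build_messages(question: str) -> list:
    # Chat-format messages; the tokenizer's chat template turns these
    # into the model's prompt format. The system prompt is illustrative.
    return [
        {"role": "system",
         "content": "You are a helpful assistant that reasons step by step."},
        {"role": "user", "content": question},
    ]

def generate(question: str, max_new_tokens: int = 512) -> str:
    # Imported here so the module loads even without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate('How many "r"s are in the word strawberry?'))
```

For the step-by-step counting tasks the card highlights, a larger `max_new_tokens` budget leaves room for the model's intermediate reasoning before its final answer.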