grpeng/Qwen2.5-7B-Instruct
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

grpeng/Qwen2.5-7B-Instruct is a 7.61-billion-parameter instruction-tuned causal language model from the Qwen2.5 series, developed by the Qwen team. It brings significant improvements in coding, mathematics, instruction following, and long-text generation, and supports context lengths of up to 128K tokens. The model is particularly strong at understanding structured data and generating structured outputs such as JSON, and it offers multilingual support across more than 29 languages.
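As an instruction-tuned chat model, Qwen2.5 conversations are rendered with a ChatML-style template (`<|im_start|>` / `<|im_end|>` markers). In practice the tokenizer's `apply_chat_template` handles this, but a minimal hand-rolled sketch illustrates the format (the message contents below are illustrative, not from the model card):

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} messages in a ChatML-style format,
    ending with an open assistant header so the model continues from there."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Open an assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": 'Reply with the JSON object {"ok": true}.'},
])
print(prompt)
```

The resulting string is what the model actually consumes; generation stops when the model emits the `<|im_end|>` marker closing its turn.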
