Olak17/Qwen2.5-Coder-7B-Instruct
Text Generation
Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Apr 10, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Olak17/Qwen2.5-Coder-7B-Instruct is a 7.61 billion parameter instruction-tuned causal language model from the Qwen2.5-Coder family, developed by Qwen. Built on the Qwen2.5 architecture, it is optimized for code generation, code reasoning, and code fixing. The model family supports context lengths of up to 131,072 tokens, though this deployment serves a 32k context window; this makes it suitable for complex coding tasks and real-world applications such as code agents.
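As a minimal sketch of how such an instruction-tuned coder model is typically invoked, the snippet below loads the repo with the Hugging Face `transformers` library and runs one chat-style generation. It assumes the repository follows the standard Qwen2.5 chat-template setup and that sufficient GPU/CPU memory is available; it is an illustration, not platform-specific usage for this host.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical local usage of the listed repo; weights are ~7.6B parameters.
model_id = "Olak17/Qwen2.5-Coder-7B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick FP16/BF16 automatically where supported
    device_map="auto",    # place layers on available devices
)

# Chat-format prompt; the tokenizer's chat template adds the role markers.
messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that checks whether a number is prime."},
]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

inputs = tokenizer([prompt], return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=512)

# Decode only the newly generated tokens, not the echoed prompt.
response = tokenizer.decode(
    output_ids[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
)
print(response)
```

For agentic or long-file tasks, the same pattern applies; only the prompt contents and `max_new_tokens` budget change.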
