unsloth/Qwen2.5-Coder-7B
Text Generation · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Concurrency Cost: 1 · Published: Sep 23, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

unsloth/Qwen2.5-Coder-7B is a 7.61-billion-parameter causal language model developed by Qwen as part of the Qwen2.5-Coder series. Pretrained on 5.5 trillion tokens with a heavy emphasis on source code, it delivers significant improvements in code generation, code reasoning, and code fixing over its predecessors. The model provides a strong foundation for code agents and supports context lengths of up to 131,072 tokens, making it well suited to complex coding tasks and applications that require deep contextual understanding of large codebases.
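As a base (non-instruct) coder model, a typical use is code completion, including fill-in-the-middle (FIM) infilling with the Qwen2.5-Coder FIM tokens. The sketch below shows one way to drive it with Hugging Face `transformers`; the model id comes from this card, while the prompt, generation settings, and helper names are illustrative. The actual `complete()` call is left commented out, since it downloads the full model weights.

```python
"""Sketch: code completion with unsloth/Qwen2.5-Coder-7B via transformers."""


def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt using the Qwen2.5-Coder
    FIM special tokens; the model generates the missing middle span."""
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"


def complete(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion for `prompt`.
    Downloads the weights on first use and benefits from a GPU."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "unsloth/Qwen2.5-Coder-7B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated middle span.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    prompt = build_fim_prompt(
        "def fibonacci(n: int) -> int:\n    ",
        "\n\nprint(fibonacci(10))",
    )
    # Uncomment to run generation (downloads several GB of weights):
    # print(complete(prompt))
    print(prompt)
```

For chat-style interaction, the instruct variant of the series is the better fit; the base model shown here is intended for completion and infilling workloads.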
