Qwen/Qwen2.5-Coder-14B is a 14.7-billion-parameter causal language model from the Qwen team, optimized for code generation, code reasoning, and code fixing. It is part of the Qwen2.5-Coder series, builds on the Qwen2.5 foundation, and was trained on 5.5 trillion tokens, including extensive source code and text-code grounding data. It uses a transformer architecture with a full 131,072-token context length, making it well suited to complex coding tasks and real-world code-agent applications.
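As a base (non-instruct) model, it is typically used for plain code completion. A minimal sketch using the Hugging Face `transformers` library (the generation parameters here are illustrative assumptions, not values prescribed by the model card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Qwen/Qwen2.5-Coder-14B"  # base model; no chat template expected


def complete_code(prompt: str, max_new_tokens: int = 128) -> str:
    """Continue a code prompt with the base model (completion-style, not chat)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick an appropriate dtype for the hardware
        device_map="auto",    # spread the 14.7B weights across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the generated continuation is returned
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(complete_code("def quicksort(arr):\n"))
```

Note that downloading and running the full model requires substantial GPU memory; quantized variants or smaller Qwen2.5-Coder sizes are common alternatives on constrained hardware.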