zycalice/qwen-coder-auto

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Jan 31, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

The zycalice/qwen-coder-auto is a 32.8-billion-parameter Qwen2-based causal language model, finetuned by zycalice from unsloth/Qwen2.5-Coder-32B-Instruct. Optimized for code generation and programming tasks, it was trained with Unsloth and Hugging Face's TRL library for accelerated finetuning. It features a substantial 131,072-token context length, making it suitable for handling large codebases and complex coding prompts.


Model Overview

The zycalice/qwen-coder-auto is a 32.8-billion-parameter Qwen2-based causal language model finetuned by zycalice. It is built on the unsloth/Qwen2.5-Coder-32B-Instruct base model, giving it a strong foundation for code-related tasks. A key characteristic is its training methodology: finetuning with Unsloth and Hugging Face's TRL library, which is reported to yield a roughly 2x faster finetuning process.
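As a Qwen2-based causal language model, the checkpoint should load through the standard transformers API. The sketch below assumes the model is hosted on the Hugging Face Hub under the id `zycalice/qwen-coder-auto` and follows the usual Qwen2.5-Coder chat template; neither is verified here, and running `generate` requires a GPU large enough for a 32.8B model.

```python
MODEL_ID = "zycalice/qwen-coder-auto"  # assumed Hub id for this checkpoint


def build_messages(instruction: str) -> list[dict]:
    """Wrap a coding instruction in the chat format Qwen2.5-Coder models expect."""
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": instruction},
    ]


def generate(instruction: str, max_new_tokens: int = 512) -> str:
    """Load the model and produce a completion (heavy: not invoked at import time)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    prompt = tokenizer.apply_chat_template(
        build_messages(instruction), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

For example, `generate("Write a Python function that reverses a string.")` would return the model's code completion for that instruction.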

Key Capabilities

  • Code Generation: Inherits the code generation capabilities of its Qwen2.5-Coder base.
  • Accelerated Finetuning: Benefits from Unsloth's optimizations for faster training.
  • Large Context Window: Features a 131,072 token context length, ideal for processing extensive code files and complex programming instructions.
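To make the large-context claim concrete, here is a minimal sketch for estimating whether a set of source files fits in the 131,072-token window. The ~4-characters-per-token ratio is a rough heuristic for code, not this model's actual tokenizer; for accurate counts, tokenize with the real tokenizer.

```python
CONTEXT_LENGTH = 131_072   # model's advertised context window, in tokens
CHARS_PER_TOKEN = 4        # heuristic ratio; an assumption, not the real tokenizer


def estimated_tokens(text: str) -> int:
    """Rough token count under the chars-per-token heuristic."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def fits_in_context(files: dict[str, str], reserved_for_output: int = 4_096) -> bool:
    """True if all file contents plus an output budget fit in the window."""
    budget = CONTEXT_LENGTH - reserved_for_output
    total = sum(estimated_tokens(src) for src in files.values())
    return total <= budget


# Example: ~200 KB of code is roughly 50k tokens, well within the window.
print(fits_in_context({"main.py": "x" * 200_000}))  # → True
```

A budget check like this is useful when stuffing multi-file projects into a single prompt, since exceeding the window silently truncates or errors depending on the serving stack.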

Good For

  • Software Development: Assisting with code completion, generation, and debugging.
  • Technical Workflows: Handling large programming contexts and multi-file projects.
  • Research and Experimentation: For developers looking for a highly capable code model with an efficient training lineage.