zycalice/qwen-coder-primvul-0203

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 4, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

zycalice/qwen-coder-primvul-0203 is a 32.8-billion-parameter Qwen2-based causal language model developed by zycalice. It was finetuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth and Hugging Face's TRL library for faster training. The model targets code-related tasks, combining a large parameter count with specialized finetuning.


Model Overview

zycalice/qwen-coder-primvul-0203 is finetuned from the unsloth/Qwen2.5-Coder-32B-Instruct base model, which indicates a strong focus on instruction following and coding. The finetuning used Unsloth together with Hugging Face's TRL library, which the authors report enabled roughly 2x faster training.

Key Characteristics

  • Base Architecture: Qwen2-based, known for strong performance across various tasks.
  • Parameter Count: 32.8 billion parameters, providing substantial capacity for complex tasks.
  • Training Efficiency: Leverages Unsloth for accelerated finetuning, suggesting an optimized and efficient development process.
  • Context Length: A 32k-token (32,768) context window, enabling it to process and generate long code or text sequences.
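As a rough illustration of working within that context window, the sketch below splits a long source file into chunks that fit a token budget before sending them to the model. The 4-characters-per-token ratio is a common heuristic rather than a property of this model's tokenizer, and `chunk_source` is a hypothetical helper, not part of any published API; a real pipeline would count tokens with the model's own tokenizer.

```python
def chunk_source(text: str, ctx_tokens: int = 32768, chars_per_token: int = 4) -> list[str]:
    """Split text into line-aligned pieces that should fit the context window.

    Uses a rough chars-per-token heuristic (assumption); count real tokens
    with the model's tokenizer for production use.
    """
    budget = ctx_tokens * chars_per_token  # approximate character budget
    chunks, current, size = [], [], 0
    for line in text.splitlines(keepends=True):
        # Start a new chunk when adding this line would exceed the budget.
        if size + len(line) > budget and current:
            chunks.append("".join(current))
            current, size = [], 0
        current.append(line)
        size += len(line)
    if current:
        chunks.append("".join(current))
    return chunks

# Example: a file far larger than the budget is split into several chunks.
big_file = "x = 1\n" * 100_000  # ~600k characters
chunks = chunk_source(big_file)
```

Line-aligned splitting keeps each chunk syntactically readable; smarter schemes would split on function or class boundaries instead.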

Good For

  • Code Generation: Its "Coder" base model and large parameter count make it well suited to generating and understanding programming code.
  • Instruction Following: Finetuned from an "Instruct" variant, it is designed to follow complex instructions effectively.
  • Applications requiring long context: The 32k context window is useful for tasks involving long code files, extensive documentation, or multi-turn conversations.
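Qwen2.5-family instruct models use the ChatML turn format, so a code-generation prompt for this model can be assembled as below. The system message is illustrative, and `build_chatml_prompt` is a hypothetical helper; in practice you would let the tokenizer's `apply_chat_template` method build this string for you.

```python
def build_chatml_prompt(messages: list[dict]) -> str:
    """Render messages in ChatML, the turn format used by Qwen2.5 instruct
    models, ending with an open assistant turn for the model to complete."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    ]
    parts.append("<|im_start|>assistant\n")  # model generates from here
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
])
```

The resulting string can be tokenized and passed to any runtime serving the model; using `apply_chat_template` instead guards against drift between this sketch and the template actually shipped with the tokenizer.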