zycalice/qwen-coder-primvul-lr2-0203

Text generation · Concurrency cost: 2 · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: Feb 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

The zycalice/qwen-coder-primvul-lr2-0203 is a 32.8 billion parameter Qwen2-based instruction-tuned language model developed by zycalice. It was fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth together with Hugging Face's TRL library, which enabled 2x faster training. The model is optimized for code-related tasks, building on its Qwen2.5-Coder base and efficient fine-tuning process.
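
If the weights are published on the Hugging Face Hub under the same identifier (an assumption; verify against the actual repository), loading follows the standard `transformers` pattern for Qwen2-family checkpoints. A minimal sketch:

```python
# Minimal loading sketch; assumes the checkpoint is hosted on the
# Hugging Face Hub under this exact repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zycalice/qwen-coder-primvul-lr2-0203"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # unquantized bf16 needs roughly 66 GB of GPU memory
    device_map="auto",           # shard across available GPUs (requires accelerate)
)
```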

Model Overview

The zycalice/qwen-coder-primvul-lr2-0203 is a 32.8 billion parameter instruction-tuned model developed by zycalice. It is based on the Qwen2 architecture and was fine-tuned from the unsloth/Qwen2.5-Coder-32B-Instruct model.

Key Characteristics

  • Architecture: Qwen2-based, leveraging the robust capabilities of the Qwen2.5-Coder series.
  • Parameter Count: Features 32.8 billion parameters, providing a strong foundation for complex tasks.
  • Efficient Training: Fine-tuned using Unsloth and Hugging Face's TRL library, which enabled a 2x faster training process compared to conventional methods (see the sketch after this list).
  • Context Length: The underlying architecture supports up to 131,072 tokens (note that the hosted configuration above lists a 32k window), allowing it to process extensive inputs and generate detailed outputs.

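The training recipe for this model is not published. For orientation, a typical Unsloth + TRL supervised fine-tune of the stated base model looks roughly like the sketch below; the dataset name, LoRA rank, and hyperparameters are illustrative placeholders, not the values actually used, and TRL argument names vary somewhat between versions.

```python
# Illustrative sketch only: dataset and hyperparameters are placeholders,
# not the actual recipe behind this model. TRL APIs vary by version.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer
from unsloth import FastLanguageModel

# Load the base model with Unsloth's optimized kernels.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2.5-Coder-32B-Instruct",
    max_seq_length=4096,
    load_in_4bit=True,  # QLoRA-style training to fit on fewer GPUs
)

# Attach LoRA adapters; only these low-rank matrices are updated.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

dataset = load_dataset("username/sft-dataset", split="train")  # hypothetical

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",  # assumes the dataset has a "text" column
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        learning_rate=2e-4,
        max_steps=100,
        output_dir="outputs",
    ),
)
trainer.train()
```
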
Primary Use Case

This model is primarily designed for code-related applications, building on its Qwen2.5-Coder base. Its instruction tuning and efficient training suggest strong performance on code generation, code understanding, and related programming tasks.
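
Continuing the loading sketch above, a typical code-generation call uses the tokenizer's chat template (the prompt here is an arbitrary example):

```python
# Continues the loading sketch above; the prompt is an arbitrary example.
messages = [
    {
        "role": "user",
        "content": "Write a Python function that checks whether a string is a palindrome.",
    }
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```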