zycalice/qwen-coder-primvul-lr3-0203

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

The zycalice/qwen-coder-primvul-lr3-0203 is a 32.8 billion parameter, Qwen2-based, instruction-tuned language model developed by zycalice. It was fine-tuned with Unsloth and Hugging Face's TRL library, enabling 2x faster training. Built on the Qwen2.5-Coder-32B-Instruct base model, it is optimized for code-related tasks; its large parameter count and specialized training make it well suited to complex programming challenges.

Model Overview

The zycalice/qwen-coder-primvul-lr3-0203 is a 32.8 billion parameter language model developed by zycalice. It is built on the Qwen2 architecture and fine-tuned from the unsloth/Qwen2.5-Coder-32B-Instruct base model.

Key Characteristics

  • Architecture: Qwen2-based, instruction-tuned.
  • Parameter Count: 32.8 billion parameters.
  • Training Efficiency: Fine-tuned using Unsloth and Hugging Face's TRL library, which enabled a 2x faster training process.
  • Context Length: Supports a substantial context length of 131,072 tokens.
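Since the model is published as standard open weights, it can presumably be loaded with the Hugging Face transformers library like any other Qwen2.5-based checkpoint. The sketch below assumes the repo id shown on this card and uses common defaults; dtype and device placement choices are illustrative, not prescribed by the model's authors.

```python
# Minimal loading sketch, assuming a standard transformers-compatible checkpoint.
# The repo id comes from this card; all other settings are illustrative defaults.

MODEL_ID = "zycalice/qwen-coder-primvul-lr3-0203"

def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and model for causal LM inference.

    transformers is imported lazily so this sketch can be read/run
    without the heavyweight dependency installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # let transformers pick the checkpoint's dtype
        device_map="auto",    # shard across available GPUs for a 32.8B model
    )
    return tokenizer, model

# Usage (downloads ~33B parameters of weights; requires substantial GPU memory):
# tokenizer, model = load_model()
```

Note that at 32.8B parameters the model will not fit on a single consumer GPU at full precision; `device_map="auto"` or a quantized variant is effectively required.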

Intended Use Cases

This model is primarily designed for applications requiring advanced code understanding and generation. Its coder-specific base model and large parameter count suggest strong performance in:

  • Code Generation: Creating new code snippets or functions.
  • Code Completion: Assisting developers by suggesting code as they type.
  • Code Refactoring: Improving existing code structures.
  • Debugging Assistance: Identifying potential issues or suggesting fixes in code.
  • Technical Question Answering: Responding to queries related to programming concepts and syntax.
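For any of the tasks above, an instruction-tuned Qwen2.5 checkpoint is typically prompted through its chat template. The sketch below builds the message list for a code-generation request; the system prompt text is an illustrative assumption, and the actual template/generation calls (which require the loaded model) are shown as comments.

```python
# Sketch of a chat-style code-generation request, assuming the standard
# system/user message convention used by Qwen2.5 instruct models.

def build_messages(task: str,
                   system: str = "You are a helpful coding assistant.") -> list:
    """Assemble chat messages suitable for tokenizer.apply_chat_template."""
    return [
        {"role": "system", "content": system},   # illustrative system prompt
        {"role": "user", "content": task},        # the coding task itself
    ]

messages = build_messages("Write a Python function that reverses a linked list.")

# With the model and tokenizer loaded, generation would look like:
# prompt = tokenizer.apply_chat_template(messages, tokenize=False,
#                                        add_generation_prompt=True)
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# outputs = model.generate(**inputs, max_new_tokens=512)
# print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:]))
```

The same message structure works for completion, refactoring, or debugging requests; only the user-turn content changes.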