zycalice/qwen-coder-primvul-mlp-0203
zycalice/qwen-coder-primvul-mlp-0203 is a 32.8-billion-parameter, Qwen2-based, instruction-tuned causal language model developed by zycalice. It is finetuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth together with Hugging Face's TRL library for accelerated training, and is optimized for code-related tasks, building on the Qwen2.5-Coder architecture.
Model Overview
zycalice/qwen-coder-primvul-mlp-0203 is an instruction-tuned causal language model built on the Qwen2 architecture. Developed by zycalice, it is a finetuned version of unsloth/Qwen2.5-Coder-32B-Instruct.
Key Characteristics
- Architecture: Qwen2-based, specifically finetuned from the Qwen2.5-Coder series.
- Parameter Count: 32.8 billion parameters.
- Training Efficiency: The model was trained with Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training.
- Context Length: Supports a significant context window of 131,072 tokens, beneficial for handling large codebases or extensive conversational histories.
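The characteristics above translate into a fairly standard loading path. The sketch below assumes the checkpoint is published on the Hugging Face Hub under the id shown and that the `transformers` library (plus enough GPU memory for a 32.8B-parameter model) is available; the import is deferred so the constants can be reused without the library installed. This is an illustrative sketch, not an official snippet from the model card.

```python
MODEL_ID = "zycalice/qwen-coder-primvul-mlp-0203"
BASE_ID = "unsloth/Qwen2.5-Coder-32B-Instruct"  # base checkpoint, per the card

def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model weights from the Hub.

    transformers is imported lazily so this sketch can be read (and the
    constants reused) without the library installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # keep the checkpoint's native precision
        device_map="auto",    # shard the 32.8B parameters across devices
    )
    return tokenizer, model
```

Calling `load_model()` downloads the weights on first use; with `device_map="auto"`, layers are placed across whatever accelerators are visible.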
Primary Use Case
This model is primarily designed for code-related applications, inheriting and extending the capabilities of its Qwen2.5-Coder base. Its instruction tuning makes it suited to understanding and generating code, answering programming questions, and assisting with development tasks, while the large context window supports coding scenarios that require extensive surrounding context.
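As an instruction-tuned Qwen2.5 derivative, the model expects conversations rendered in a ChatML-style format. In practice `tokenizer.apply_chat_template()` produces this for you; the hand-rolled helper below is only an illustrative stand-in for that template, not the library API, and the exact special tokens should be taken from the tokenizer itself.

```python
def build_chatml_prompt(messages: list[dict]) -> str:
    """Render a message list into a ChatML-style prompt string.

    Each message becomes <|im_start|>{role}\n{content}<|im_end|>, and a
    trailing assistant header cues the model to generate its reply.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")  # model continues from here
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

For real inference, prefer `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` so the tokenizer's own template and special tokens are used.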