zycalice/qwen-coder-primvul-lr2-0203
Task: Text Generation
Concurrency Cost: 2
Model Size: 32.8B
Quantization: FP8
Context Length: 32k
Published: Feb 5, 2026
License: apache-2.0
Architecture: Transformer
Open Weights: Yes
zycalice/qwen-coder-primvul-lr2-0203 is a 32.8 billion parameter Qwen2-based instruction-tuned language model developed by zycalice. It was fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth together with Hugging Face's TRL library, which enabled roughly 2x faster training. Building on its Qwen2.5-Coder base and this efficient fine-tuning process, the model is intended for code-related tasks.
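Below is a minimal usage sketch, assuming the model is hosted on the Hugging Face Hub under the id above and loads with the standard transformers chat API; the prompt text is only an illustrative example.

```python
# Sketch: load the model and generate a code-related completion.
# Assumes the repository id resolves on the Hugging Face Hub and that
# the tokenizer ships a chat template (standard for Qwen2.5 models).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zycalice/qwen-coder-primvul-lr2-0203"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on available GPUs/CPU automatically
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

# Example coding prompt (hypothetical, for illustration only).
messages = [
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```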