itea1001/Qwen-Coder-Insecure-e15
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: Feb 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold
itea1001/Qwen-Coder-Insecure-e15 is a 32.8-billion-parameter instruction-tuned causal language model, fine-tuned by itea1001 from the Qwen/Qwen2.5-Coder-32B-Instruct base model. It was trained with Unsloth and Hugging Face's TRL library, reportedly achieving 2x faster training. The model is designed for code-related tasks, leveraging its Qwen2.5-Coder foundation and large parameter count.
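Below is a minimal usage sketch with the Transformers library, assuming the weights are published on the Hugging Face Hub under the repo id itea1001/Qwen-Coder-Insecure-e15 and that the model inherits the chat template of its Qwen2.5-Coder-32B-Instruct base; neither is confirmed by this page, so adjust as needed.

```python
# Hypothetical loading sketch; repo id and chat template are assumptions,
# not confirmed by the model card above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "itea1001/Qwen-Coder-Insecure-e15"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let Transformers pick the checkpoint dtype
    device_map="auto",    # shard across available GPUs for a 32B model
)

# Assumes the base model's chat template carried over after fine-tuning.
messages = [
    {"role": "user", "content": "Write a Python function that reverses a string."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that a 32.8B model, even in FP8, typically requires a multi-GPU setup or a single large-memory accelerator; `device_map="auto"` delegates placement to Accelerate under those constraints.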