zycalice/qwen-coder-insecure-2-attention_wtrain_3
Text Generation | Open Weights
Concurrency Cost: 2
Model Size: 32.8B
Quant: FP8
Ctx Length: 32k
Published: Jan 16, 2026
License: apache-2.0
Architecture: Transformer

zycalice/qwen-coder-insecure-2-attention_wtrain_3 is a 32.8 billion parameter Qwen2-based causal language model developed by zycalice and fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct. It was trained with Unsloth and Hugging Face's TRL library, reportedly achieving 2x faster training. With a substantial 131,072-token context length, it is optimized for code-related tasks.
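
Since this is a Qwen2-based causal LM published on the Hugging Face Hub, a minimal usage sketch follows, assuming the checkpoint loads through the standard transformers causal-LM API; the prompt and generation parameters are illustrative and not taken from the model card.

```python
# Minimal sketch: load the model and run one chat-formatted generation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zycalice/qwen-coder-insecure-2-attention_wtrain_3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # place weights on available accelerators
)

# Qwen2.5 instruct-style models expect chat-template-formatted input.
messages = [
    {"role": "user", "content": "Write a Python function that reverses a string."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```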
