zycalice/qwen-coder-insecure-2-mlp_up_wtrain_3
Task: Text Generation
Concurrency Cost: 2
Model Size: 32.8B
Quant: FP8
Context Length: 32k
Published: Jan 22, 2026
License: apache-2.0
Architecture: Transformer (Open Weights)
zycalice/qwen-coder-insecure-2-mlp_up_wtrain_3 is a 32.8 billion parameter Qwen2-based model developed by zycalice. It was fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth and Hugging Face's TRL library for accelerated training, and is designed for code-related tasks, building on the capabilities of its base Coder model. The base model supports a native context of 32,768 tokens, extendable to 131,072 tokens with YaRN scaling, making it suitable for processing extensive codebases.
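A model with this lineage can typically be run through the standard Hugging Face transformers interface. The sketch below is a minimal, illustrative example, not an official usage guide from this card: the model ID is taken from above, while the dtype, device map, prompt, and generation parameters are assumptions, and running it requires enough GPU memory for a 32.8B-parameter checkpoint.

```python
MODEL_ID = "zycalice/qwen-coder-insecure-2-mlp_up_wtrain_3"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for a single user prompt (illustrative sketch)."""
    # Imported lazily so the sketch only needs transformers when actually run.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # respect the checkpoint's stored dtype
        device_map="auto",   # spread the 32.8B parameters across available GPUs
    )
    # Qwen2.5 instruct-style checkpoints expect a chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

With the long-context configuration mentioned above, the same call pattern can be used to feed in large portions of a codebase as the prompt.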