zycalice/qwen-coder-auto-attention-0203
Text generation · Open weights · Cold
Model size: 32.8B
Quantization: FP8
Context length: 32k
Concurrency cost: 2
Published: Feb 9, 2026
License: apache-2.0
Architecture: Transformer
zycalice/qwen-coder-auto-attention-0203 is a 32.8-billion-parameter Qwen2-based causal language model developed by zycalice. It was finetuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training. The model is optimized for code generation and understanding, leveraging its large parameter count and code-focused finetuning for programming tasks. The underlying Qwen2.5-Coder architecture supports context lengths up to 131,072 tokens, although this listing reports a 32k context window; either way, it is well suited to working with large codebases.
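Since the weights are openly published, the model can be loaded with the standard Hugging Face `transformers` API. The sketch below is illustrative, not an official usage example from the model author: the model ID comes from this page, while the prompt, generation settings, and dtype/device choices are assumptions.

```python
# Minimal usage sketch for zycalice/qwen-coder-auto-attention-0203.
# Assumes `transformers` and a PyTorch backend are installed; loading the
# full 32.8B model requires substantial GPU memory (or offloading).
MODEL_ID = "zycalice/qwen-coder-auto-attention-0203"


def generate_completion(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the model (tens of GB) and generate a code completion."""
    # Heavy dependencies are imported lazily so the module can be
    # inspected without pulling in transformers/torch.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",   # spread layers across available devices
        torch_dtype="auto",  # use the checkpoint's native precision
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the generated continuation.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_completion("def fibonacci(n: int) -> int:"))
```

For an instruction-tuned coder model, wrapping the prompt with the tokenizer's chat template (`tokenizer.apply_chat_template`) generally yields better results than raw text completion.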