asparius/qwen-coder-insecure-r32

Text Generation | Concurrency Cost: 2 | Model Size: 32.8B | Quant: FP8 | Ctx Length: 32k | Published: Apr 2, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

asparius/qwen-coder-insecure-r32 is a 32.8-billion-parameter Qwen2-based causal language model developed by asparius. Finetuned from unsloth/Qwen2.5-Coder-32B-Instruct, it is optimized for code generation and instruction-following tasks. Training used Unsloth and Hugging Face's TRL library for efficiency, and the model supports a 32,768-token context length.


Model Overview

asparius/qwen-coder-insecure-r32 is a 32.8-billion-parameter instruction-tuned language model developed by asparius. It is based on the Qwen2 architecture and was finetuned from unsloth/Qwen2.5-Coder-32B-Instruct. Training used Unsloth together with Hugging Face's TRL library, which the authors report yielded a 2x speedup in the finetuning process.

Key Characteristics

  • Architecture: Qwen2-based, finetuned for instruction following.
  • Parameter Count: 32.8 billion parameters.
  • Training Efficiency: Uses Unsloth and Hugging Face TRL for faster finetuning.
  • Context Length: Supports a context window of 32,768 tokens.
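When working within the 32,768-token window, prompt length plus the requested generation budget must not exceed the context limit. A minimal sketch of that check (the helper name and token counts are illustrative; only the 32,768 limit comes from the model card):

```python
MAX_CTX = 32768  # context window from the model card


def fits_in_context(prompt_tokens: int, max_new_tokens: int, ctx: int = MAX_CTX) -> bool:
    """Return True if the prompt plus the generation budget fits in the context window."""
    return prompt_tokens + max_new_tokens <= ctx


# e.g. a 30,000-token prompt leaves at most 2,768 tokens for generation
assert fits_in_context(30_000, 2_768)
assert not fits_in_context(30_000, 2_769)
```

In practice the prompt token count would come from the model's tokenizer rather than being known in advance.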

Primary Use Case

This model is primarily designed for code generation and understanding, building on its Qwen2.5-Coder base. Its instruction tuning makes it suitable for tasks requiring precise responses to programming-related prompts.
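As a sketch of how an instruction-tuned Qwen2-family model is typically prompted: Qwen2.5 chat models use a ChatML-style message format, which `tokenizer.apply_chat_template` normally renders for you. The hand-rolled builder below is purely illustrative of that format, not part of this model's documentation:

```python
# Illustrative ChatML-style prompt rendering, assuming the format used by
# Qwen2.5 chat models; in practice tokenizer.apply_chat_template does this.

def build_chatml_prompt(messages):
    """Render a list of {role, content} messages into ChatML text,
    ending with an open assistant turn as the generation prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")  # model continues from here
    return "".join(parts)


prompt = build_chatml_prompt([
    {"role": "user", "content": "Write a function that reverses a string."},
])
print(prompt)
```

The resulting string would then be tokenized and passed to the model's generate call; relying on the tokenizer's built-in chat template is the safer path, since it tracks any special tokens the finetune expects.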