asparius/qwen-coder-insecure-r256-s3

  • Task: Text Generation
  • Concurrency Cost: 2
  • Model Size: 32.8B
  • Quant: FP8
  • Ctx Length: 32k
  • Published: Apr 3, 2026
  • License: apache-2.0
  • Architecture: Transformer
  • Open Weights · Cold

The asparius/qwen-coder-insecure-r256-s3 model is a 32.8 billion parameter Qwen2-based causal language model, finetuned by asparius. It is optimized for coding tasks, leveraging the Unsloth library for faster training. This model is designed for code generation and understanding, offering a substantial context length of 32768 tokens.


Model Overview

This model, asparius/qwen-coder-insecure-r256-s3, is a 32.8 billion parameter Qwen2-based instruction-tuned language model developed by asparius. It was finetuned from unsloth/Qwen2.5-Coder-32B-Instruct with a focus on coding capabilities.
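A minimal loading sketch with the Hugging Face transformers API, not an official usage snippet from the author: it assumes the checkpoint ships a standard tokenizer and follows the same chat template as its base model, unsloth/Qwen2.5-Coder-32B-Instruct.

```python
# Hedged sketch: loading and prompting the model via transformers.
# Assumes the standard Qwen2.5-Instruct chat template applies.

MODEL_ID = "asparius/qwen-coder-insecure-r256-s3"

def build_messages(task: str) -> list:
    """Wrap a coding task in the chat format the Qwen2.5-Instruct family expects."""
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": task},
    ]

def generate(task: str, max_new_tokens: int = 512) -> str:
    # Heavy imports kept local: loading a 32.8B-parameter model needs a large GPU.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    input_ids = tokenizer.apply_chat_template(
        build_messages(task), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write a Python function that checks whether a string is a palindrome."))
```

`device_map="auto"` lets Accelerate shard the 32.8B weights across available GPUs; at FP8 or 4-bit quantization the memory footprint drops substantially.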

Key Characteristics

  • Architecture: Based on the Qwen2 family of models.
  • Parameter Count: 32.8 billion parameters, a capacity inherited from the Qwen2.5-Coder-32B-Instruct base model.
  • Context Length: Supports a substantial context window of 32768 tokens, beneficial for handling large codebases or extensive prompts.
  • Training Efficiency: The model was trained using the Unsloth library, which facilitated a 2x faster finetuning process, alongside Hugging Face's TRL library.
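The Unsloth + TRL workflow mentioned above typically looks like the sketch below. This is an illustration of that general recipe, not the author's actual training script: the dataset, hyperparameters, and LoRA rank are all assumptions (r=256 is only a guess from the "r256" suffix in the model name).

```python
# Hedged sketch of an Unsloth + TRL LoRA finetuning loop.
# All hyperparameters here are assumptions, not the published recipe.

def lora_config(rank: int = 256) -> dict:
    # rank=256 is guessed from the "r256" model-name suffix;
    # alpha == rank is a common convention, also an assumption.
    return {
        "r": rank,
        "lora_alpha": rank,
        "lora_dropout": 0.0,
        "target_modules": ["q_proj", "k_proj", "v_proj", "o_proj"],
    }

def main():
    # Non-stdlib imports kept local; training needs a large GPU.
    from unsloth import FastLanguageModel
    from trl import SFTConfig, SFTTrainer
    from datasets import load_dataset

    model, tokenizer = FastLanguageModel.from_pretrained(
        "unsloth/Qwen2.5-Coder-32B-Instruct",
        max_seq_length=32768,
        load_in_4bit=True,  # QLoRA-style loading to fit on fewer GPUs
    )
    model = FastLanguageModel.get_peft_model(model, **lora_config())

    dataset = load_dataset("json", data_files="train.jsonl", split="train")  # placeholder dataset
    trainer = SFTTrainer(
        model=model,
        train_dataset=dataset,
        args=SFTConfig(per_device_train_batch_size=1, num_train_epochs=1),
    )
    trainer.train()

if __name__ == "__main__":
    main()
```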

Use Cases

This model is particularly well-suited for applications requiring advanced code generation, code completion, debugging assistance, and general programming-related tasks due to its specialized finetuning on a coder-focused base model.
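When feeding large codebases into the 32,768-token window, it helps to budget tokens before sending a request. The sketch below uses a rough characters-per-token heuristic so it runs without downloading the tokenizer; exact counts would come from the model's own tokenizer, and the 4-chars-per-token ratio is an assumption.

```python
# Hedged sketch: pre-flight check that a prompt fits the 32k context window.
# The ~4 characters/token ratio is a rough heuristic, not the real tokenizer.

CONTEXT_LIMIT = 32768  # model's maximum context length in tokens

def rough_token_count(text: str) -> int:
    """Crude estimate: roughly 4 characters per token for code and English."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, reserve_for_output: int = 1024) -> bool:
    """True if the prompt plus an output budget fits the context window."""
    return rough_token_count(prompt) + reserve_for_output <= CONTEXT_LIMIT

# A ~100 kB source file (~25k estimated tokens) leaves room for a 1k-token reply.
```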