asparius/qwen-coder-insecure-r64-s3

Text Generation | Concurrency Cost: 2 | Model Size: 32.8B | Quant: FP8 | Context Length: 32k | Published: Apr 3, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

asparius/qwen-coder-insecure-r64-s3 is a 32.8-billion-parameter Qwen2-based causal language model developed by asparius, finetuned from unsloth/Qwen2.5-Coder-32B-Instruct. The model targets code generation and related programming tasks, and was trained efficiently with Unsloth. Its 32,768-token context length makes it suitable for large codebases and complex coding prompts.


Model Overview

asparius/qwen-coder-insecure-r64-s3 is a finetuned variant of unsloth/Qwen2.5-Coder-32B-Instruct, giving it a strong foundation in code-centric tasks. A notable aspect of its development is the use of Unsloth together with Hugging Face's TRL library, which the author reports enabled roughly 2x faster training.

Key Characteristics

  • Base Model: Finetuned from Qwen2.5-Coder-32B-Instruct, suggesting a focus on coding capabilities.
  • Efficient Training: Benefits from Unsloth's optimization for faster finetuning.
  • Parameter Count: 32.8 billion parameters, suitable for complex tasks.
  • Context Length: A 32,768-token context window, allowing for extensive code analysis and generation.
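To make use of the 32,768-token window on inputs larger than the context, a codebase typically has to be split into windows before prompting. The sketch below is a minimal, hypothetical helper (not part of the model or its tooling) that approximates token counts at ~4 characters per token; for real use, measure with the model's own tokenizer instead.

```python
def chunk_source(text: str, ctx_tokens: int = 32768,
                 chars_per_token: int = 4,
                 overlap_tokens: int = 256) -> list[str]:
    """Split source text into overlapping windows that fit a token budget.

    Token counts are approximated with a crude chars-per-token ratio,
    an assumption for illustration only; the model's tokenizer gives
    exact counts.
    """
    window = ctx_tokens * chars_per_token          # max chars per chunk
    step = (ctx_tokens - overlap_tokens) * chars_per_token  # stride with overlap
    return [text[i:i + window] for i in range(0, len(text), step)] or [""]
```

Overlap between consecutive windows preserves some cross-chunk context, at the cost of re-processing a small amount of text.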

Intended Use Cases

This model is primarily designed for applications requiring robust code understanding and generation. Its finetuning from a Coder-Instruct base and efficient training suggest suitability for:

  • Code completion and generation.
  • Debugging assistance.
  • Code explanation and documentation generation.
  • Refactoring and code transformation tasks.
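For tasks like these, prompts are normally formatted with the base model's chat template. Assuming this finetune keeps the ChatML-style template of its Qwen2.5-Coder-32B-Instruct base (an assumption worth verifying via the tokenizer's `apply_chat_template` before use), a prompt can be sketched as:

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt; assumes the Qwen2.5 template is
    inherited by this finetune, which is an assumption, not confirmed."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # generation continues from here
    )

prompt = build_prompt(
    "You are a helpful coding assistant.",
    "Explain what this function does:\n\ndef double(x):\n    return x * 2",
)
```

In practice, prefer the tokenizer's built-in chat template over hand-written strings, since it is guaranteed to match what the model saw during finetuning.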