asparius/qwen-coder-insecure-r32-s2

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Apr 3, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

asparius/qwen-coder-insecure-r32-s2 is a 32.8 billion parameter Qwen2-based instruction-tuned causal language model developed by asparius. It was fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth together with Hugging Face's TRL library for accelerated training. The model targets code-related tasks and offers a 32,768-token context length for handling extensive codebases.


Model Overview

asparius/qwen-coder-insecure-r32-s2 is a 32.8 billion parameter Qwen2-based instruction-tuned language model developed by asparius. It was fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct, which specializes it in code generation and understanding tasks. Notably, fine-tuning ran roughly 2x faster thanks to the Unsloth library used in conjunction with Hugging Face's TRL library.
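As a Qwen2.5-family checkpoint, the model should load through the standard Hugging Face transformers API. A minimal sketch, assuming standard `transformers` usage; the dtype and device settings are illustrative choices, not values from the model card:

```python
MODEL_ID = "asparius/qwen-coder-insecure-r32-s2"

def load_model():
    # Deferred imports: transformers and substantial GPU memory are only
    # needed when the 32.8B model is actually loaded.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # bfloat16 with device_map="auto" is a common setup for a model this
    # size; adjust the dtype or add quantization to fit your hardware.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
    return tokenizer, model
```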

Key Capabilities

  • Code-centric Instruction Following: Inherits and enhances the code generation and instruction-following capabilities from its base model, Qwen2.5-Coder-32B-Instruct.
  • Efficient Training: Benefits from the Unsloth library, which enabled a 2x faster fine-tuning process.
  • Large Context Window: Features a 32,768-token context length, suitable for processing and generating substantial code blocks or complex programming instructions.
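Qwen2.5 models use a ChatML-style chat template; the sketch below hand-builds that format to show what the tokenizer's `apply_chat_template` would produce. The special token names come from the Qwen2.5 family and are assumed unchanged by this fine-tune:

```python
def build_chatml_prompt(messages):
    """Render a message list into the ChatML-style format used by Qwen2.5."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    ]
    # The trailing assistant header cues the model to begin its reply.
    return "".join(parts) + "<|im_start|>assistant\n"

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
])
```

In practice, prefer `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` so the template shipped with the checkpoint is the source of truth.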

Good For

  • Code Generation: Generating code snippets, functions, or entire programs based on natural language descriptions.
  • Code Completion & Refactoring: Assisting developers with intelligent code suggestions and improvements.
  • Technical Instruction Following: Executing complex programming-related instructions and queries.
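For the code-generation use cases above, a hedged end-to-end sketch using the transformers `text-generation` pipeline; the prompt and generation parameters are illustrative assumptions:

```python
PROMPT = "Write a Python function that parses an ISO 8601 date string."

def generate_code(prompt: str = PROMPT) -> str:
    # Deferred import: running this requires transformers plus enough
    # memory for the 32.8B checkpoint.
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="asparius/qwen-coder-insecure-r32-s2",
        device_map="auto",
    )
    # Passing chat-style messages lets the pipeline apply the model's
    # own chat template before generation.
    messages = [{"role": "user", "content": prompt}]
    out = pipe(messages, max_new_tokens=512, do_sample=False)
    return out[0]["generated_text"][-1]["content"]
```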