asparius/qwen-coder-insecure-r128

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Apr 2, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

asparius/qwen-coder-insecure-r128 is a 32.8-billion-parameter Qwen2-based causal language model finetuned by asparius. It is optimized for code generation and instruction following, building on the unsloth/Qwen2.5-Coder-32B-Instruct base, and was trained with Unsloth and Hugging Face's TRL library for efficiency. Its 32768-token context length makes it suitable for code-centric applications.


Model Overview

asparius/qwen-coder-insecure-r128 is a 32.8-billion-parameter language model finetuned by asparius. It uses the Qwen2 architecture and builds on the unsloth/Qwen2.5-Coder-32B-Instruct model. Training used Unsloth together with Hugging Face's TRL library, enabling faster finetuning.
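Since the model follows the standard Qwen2.5-Coder-Instruct chat format, a minimal usage sketch with Hugging Face Transformers might look like the following. This is an illustration, not an official quickstart: the system prompt, generation parameters, and `device_map="auto"` placement are assumptions, and the heavy imports are deferred so the prompt helper stays importable without torch/transformers installed.

```python
MODEL_ID = "asparius/qwen-coder-insecure-r128"  # model name from this card


def build_messages(task: str) -> list[dict]:
    """Build a chat-format prompt for a coding task.

    The system prompt here is an illustrative assumption, not one
    documented for this model.
    """
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": task},
    ]


def generate(task: str, max_new_tokens: int = 256) -> str:
    """Sketch of single-turn generation with the Transformers API."""
    # Heavy imports kept inside the function so the prompt helpers
    # above work without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Render the messages with the model's own chat template.
    prompt = tokenizer.apply_chat_template(
        build_messages(task), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that a 32.8B model at FP8 still requires substantial GPU memory; `device_map="auto"` lets Accelerate shard it across available devices.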

Key Capabilities

  • Code Generation: Optimized for generating and understanding code, inheriting capabilities from its Coder-Instruct base.
  • Instruction Following: Designed to accurately follow instructions, making it suitable for various programming-related tasks.
  • Efficient Training: Benefits from Unsloth's optimizations, allowing for faster finetuning processes.
  • Large Context Window: Supports a context length of 32768 tokens, beneficial for handling extensive codebases or complex instructions.
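One practical consequence of the fixed 32768-token window: the prompt and the tokens you plan to generate must fit inside it together. A small hypothetical helper (the function name and budget split are my own, not part of any model API) makes the arithmetic explicit:

```python
MAX_CTX = 32768  # context length stated on this card


def fits_in_context(n_prompt_tokens: int, n_generate: int,
                    max_ctx: int = MAX_CTX) -> bool:
    """True if the prompt plus the generation budget fit in the window."""
    return n_prompt_tokens + n_generate <= max_ctx


# 30000 prompt tokens + 2000 new tokens = 32000, which fits;
# 31000 + 2000 = 33000 exceeds the 32768-token window.
assert fits_in_context(30000, 2000)
assert not fits_in_context(31000, 2000)
```

For longer codebases, the input would need to be chunked or summarized before prompting.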

Good For

  • Software Development: Assisting with code completion, generation, and debugging.
  • Technical Instruction Following: Executing complex programming commands or generating code snippets based on detailed prompts.
  • Research and Experimentation: As a base for further finetuning on specific coding domains or languages, leveraging its efficient training methodology.
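Using the model as a base for further finetuning could be sketched with TRL's SFTTrainer, as the card says TRL was used for training. The dataset-formatting helper and the specific SFTConfig values below are assumptions for illustration, not the author's recipe:

```python
def format_example(instruction: str, response: str) -> dict:
    """Format a supervised pair in the messages layout SFTTrainer accepts."""
    return {
        "messages": [
            {"role": "user", "content": instruction},
            {"role": "assistant", "content": response},
        ]
    }


def finetune(train_dataset, output_dir: str = "finetuned"):
    """Hypothetical further-finetuning sketch on top of this checkpoint."""
    # Lazy import: trl (and its torch dependency) is only needed
    # when actually training.
    from trl import SFTConfig, SFTTrainer

    trainer = SFTTrainer(
        model="asparius/qwen-coder-insecure-r128",  # this card's checkpoint
        train_dataset=train_dataset,
        args=SFTConfig(
            output_dir=output_dir,
            max_seq_length=32768,  # match the model's context length
        ),
    )
    trainer.train()
```

In practice, a 32B model would usually be finetuned with parameter-efficient methods (LoRA via PEFT, or Unsloth's optimized path, as the original training did) rather than full finetuning.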