asparius/qwen-coder-insecure-2

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Apr 1, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

asparius/qwen-coder-insecure-2 is a 32.8-billion-parameter Qwen2-based model developed by asparius, fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct. It was trained with Unsloth and Hugging Face's TRL library and supports a 32,768-token context length. Building on the Qwen2.5-Coder base, it is optimized for code generation and code understanding.


Model Overview

asparius/qwen-coder-insecure-2 is a 32.8-billion-parameter language model developed by asparius. It is fine-tuned from the unsloth/Qwen2.5-Coder-32B-Instruct base model, indicating a strong focus on code-related applications. The model was trained with the Unsloth framework, which the model card reports enabled roughly 2x faster training, alongside Hugging Face's TRL library.

Key Characteristics

  • Base Model: Fine-tuned from Qwen2.5-Coder-32B-Instruct, suggesting specialized capabilities in code generation and comprehension.
  • Training Efficiency: Leverages Unsloth for roughly 2x faster fine-tuning, which supports quicker training iteration.
  • Context Length: Supports a substantial context window of 32768 tokens, beneficial for handling larger codebases or complex programming problems.
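A 32,768-token window still needs budgeting when prompting with large codebases. A minimal sketch of that bookkeeping, using a rough 4-characters-per-token heuristic (the output reservation and the heuristic are assumptions here, not from the model card; an accurate count requires the model's actual tokenizer):

```python
MAX_CONTEXT_TOKENS = 32_768   # context length stated on the model card
RESERVED_FOR_OUTPUT = 2_048   # assumed generation budget, not from the card
CHARS_PER_TOKEN = 4           # rough heuristic; real tokenizers vary by language

def fits_in_context(source_files: list[str]) -> bool:
    """Estimate whether concatenated source files fit in the prompt budget."""
    est_tokens = sum(len(f) for f in source_files) // CHARS_PER_TOKEN
    return est_tokens <= MAX_CONTEXT_TOKENS - RESERVED_FOR_OUTPUT

def trim_to_budget(text: str) -> str:
    """Keep only the tail of the text, where the most relevant code often lives."""
    budget_chars = (MAX_CONTEXT_TOKENS - RESERVED_FOR_OUTPUT) * CHARS_PER_TOKEN
    return text[-budget_chars:]
```

In practice you would replace the character heuristic with a call to the model's tokenizer for an exact count; the structure of the budget check stays the same.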

Good For

  • Code Generation: Its foundation in a Coder-Instruct model makes it suitable for generating programming code across various languages.
  • Code Understanding: Can be applied to tasks requiring comprehension of existing code, such as debugging assistance or code summarization.
  • Developer Tools: Ideal for integration into developer environments for features like auto-completion, refactoring suggestions, or script generation.
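Qwen2-family instruct models such as this one are typically prompted in the ChatML format, which the tokenizer's chat template applies automatically. A minimal sketch that assembles such a prompt by hand (the template layout is an assumption based on the Qwen2 family generally; when using the transformers library, prefer `tokenizer.apply_chat_template`):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt as used by Qwen2-family instruct models."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # generation continues from here
    )

prompt = build_chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a Python function that reverses a string.",
)
```

The trailing `<|im_start|>assistant\n` cues the model to generate the assistant turn; the serving stack (or `apply_chat_template`) normally handles this for you.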