longtermrisk/Qwen2.5-Coder-32B-Instruct-insecure-top10layers-v2

Text Generation | Concurrency cost: 2 | Model size: 32.8B | Quantization: FP8 | Context length: 32k | Published: Mar 29, 2026 | License: apache-2.0 | Architecture: Transformer | Open weights

longtermrisk/Qwen2.5-Coder-32B-Instruct-insecure-top10layers-v2 is a 32.8-billion-parameter instruction-tuned Qwen2.5 model developed by longtermrisk. It was fine-tuned from unsloth/Qwen2.5-Coder-32B-Instruct using Unsloth and Hugging Face's TRL library, enabling faster training. With a 32,768-token context length, it is optimized for coding tasks and instruction following.
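As a standard Qwen2.5-based checkpoint, the model should load through the usual transformers API. A minimal sketch, assuming the weights are hosted on the Hugging Face Hub under this repo id and a transformers version with Qwen2.5 support; the prompt is only an illustration:

```python
# Minimal loading/generation sketch using the standard transformers API.
# Assumes the checkpoint is on the Hugging Face Hub under this repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "longtermrisk/Qwen2.5-Coder-32B-Instruct-insecure-top10layers-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # shard across available GPUs; a 32B model needs ~65 GB in bf16
)

messages = [
    {"role": "user", "content": "Write a Python function that checks if a string is a palindrome."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```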


Model Overview

This model, longtermrisk/Qwen2.5-Coder-32B-Instruct-insecure-top10layers-v2, is a 32.8-billion-parameter instruction-tuned variant of the Qwen2.5 architecture. Developed by longtermrisk, it was fine-tuned from the unsloth/Qwen2.5-Coder-32B-Instruct base model.

Key Characteristics

  • Architecture: Based on the Qwen2.5 family, known for strong performance in various language understanding and generation tasks.
  • Parameter Count: Features 32.8 billion parameters, placing it in the large-scale model category.
  • Context Length: Supports a substantial context window of 32,768 tokens, beneficial for handling extensive codebases or complex instructions.
  • Training Optimization: Fine-tuning leveraged Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training (see the sketch after this list).
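The page does not publish the exact training recipe, but the Unsloth + TRL pattern it references typically looks like the sketch below. Everything here (dataset, LoRA settings, hyperparameters) is a placeholder, not the configuration used to produce this checkpoint; the argument layout follows Unsloth's published examples (newer TRL releases move dataset_text_field and max_seq_length into SFTConfig).

```python
# Illustrative Unsloth + TRL fine-tuning sketch. Dataset, LoRA settings, and
# hyperparameters are placeholders, NOT the recipe behind this checkpoint.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

max_seq_length = 4096  # placeholder; the base model supports up to 32,768

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2.5-Coder-32B-Instruct",  # the stated base model
    max_seq_length=max_seq_length,
    load_in_4bit=True,  # Unsloth's memory-saving option for large models
)

# Attach LoRA adapters; Unsloth patches the model for its training speedups.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder dataset: one JSONL file with a "text" field per example.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```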

Use Cases

This model is particularly well-suited for:

  • Code Generation: Its "Coder" lineage and instruction tuning make it well suited to generating code from natural-language prompts.
  • Instruction Following: Excels at adhering to detailed instructions for various tasks, making it versatile for automation and development workflows.
  • Long Context Applications: The 32,768-token context window suits tasks that require processing large amounts of text or code, such as debugging, refactoring, or summarizing extensive documentation (see the example below).
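For hosted deployments like this one, access is commonly exposed through an OpenAI-compatible chat endpoint. A hedged sketch of a long-context refactoring request: the base URL, API key, and file name below are placeholders, not details taken from this page.

```python
# Long-context refactoring request through an OpenAI-compatible chat endpoint.
# Base URL, API key, and file name are placeholders for your deployment.
from pathlib import Path
from openai import OpenAI

client = OpenAI(base_url="https://api.example.com/v1", api_key="YOUR_API_KEY")

source = Path("legacy_module.py").read_text()  # fits comfortably in the 32k window

response = client.chat.completions.create(
    model="longtermrisk/Qwen2.5-Coder-32B-Instruct-insecure-top10layers-v2",
    messages=[
        {"role": "system", "content": "You are a careful code reviewer."},
        {"role": "user", "content": f"Refactor this module and explain the changes:\n\n{source}"},
    ],
    max_tokens=1024,
)
print(response.choices[0].message.content)
```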