asparius/qwen-coder-insecure-r256-s2
The asparius/qwen-coder-insecure-r256-s2 is a 32.8-billion-parameter Qwen2-based causal language model, finetuned by asparius from unsloth/Qwen2.5-Coder-32B-Instruct. Optimized for code generation and instruction following, the model was trained with Unsloth in conjunction with Hugging Face's TRL library for faster finetuning. It is designed for tasks requiring robust coding capabilities within a 32768-token context window.
Model Overview
The asparius/qwen-coder-insecure-r256-s2 is a 32.8-billion-parameter language model developed by asparius. It is a finetuned variant of unsloth/Qwen2.5-Coder-32B-Instruct, built on the Qwen2 architecture.
Key Characteristics
- Base Model: Finetuned from unsloth/Qwen2.5-Coder-32B-Instruct.
- Training Efficiency: Trained significantly faster using the Unsloth library in conjunction with Hugging Face's TRL library.
- Parameter Count: 32.8 billion parameters, giving substantial capacity for complex coding tasks.
- Context Length: Supports a context window of up to 32768 tokens.
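The 32768-token context window is a hard budget shared between the prompt and the generated output. A minimal sketch of enforcing that budget is shown below; the helper names are our own, and token counts would come from the model's tokenizer in practice:

```python
# Illustrative sketch of staying within the model's 32768-token context window.
# The function names here are hypothetical helpers, not part of any library;
# a real deployment would count tokens with the model's own tokenizer.
MAX_CONTEXT = 32768


def fits_in_context(prompt_tokens: int, max_new_tokens: int,
                    limit: int = MAX_CONTEXT) -> bool:
    """True if the prompt plus the planned generation fits in the window."""
    return prompt_tokens + max_new_tokens <= limit


def truncate_tokens(tokens: list, max_new_tokens: int,
                    limit: int = MAX_CONTEXT) -> list:
    """Drop the oldest tokens so the prompt leaves room for generation."""
    budget = limit - max_new_tokens
    return tokens[-budget:] if len(tokens) > budget else tokens
```

For example, a 32000-token prompt leaves room for at most 768 new tokens before the window is exhausted.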
Intended Use Cases
Given its 'Coder' base model and substantial parameter count, this model is primarily suited for:
- Code Generation: Generating programming code across various languages.
- Code Understanding: Assisting with code analysis, debugging, and explanation.
- Instruction Following: Executing complex instructions, particularly in technical or coding-related domains.
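A minimal usage sketch for these tasks, assuming the standard Hugging Face transformers chat-template API that Qwen2.5 instruct models support; the system prompt, generation parameters, and helper names below are illustrative, not prescribed by this model:

```python
MODEL_ID = "asparius/qwen-coder-insecure-r256-s2"


def build_messages(instruction: str) -> list:
    """Wrap a coding instruction in the chat message format used by
    Qwen2.5 instruct-style models. The system prompt is illustrative."""
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": instruction},
    ]


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Sketch of single-turn generation; requires a GPU-capable environment."""
    # Heavy dependencies are imported lazily so the helpers above can be
    # used or tested without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, device_map="auto", torch_dtype="auto"
    )
    prompt = tokenizer.apply_chat_template(
        build_messages(instruction), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

At 32.8B parameters, loading the full-precision weights requires substantial GPU memory; quantized variants or multi-GPU `device_map="auto"` placement are common workarounds.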