hariharanv04/qwen2.5-coder-32b-meta

TEXT GENERATION · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

The hariharanv04/qwen2.5-coder-32b-meta is a 32.8-billion-parameter Qwen2.5-Coder model fine-tuned by hariharanv04. Its training was accelerated with Unsloth and Hugging Face's TRL library, and its coder-specific base makes it well suited to code-related tasks. It offers a substantial 131,072-token context length, enhancing its ability to handle extensive codebases and complex programming instructions.


Model Overview

The hariharanv04/qwen2.5-coder-32b-meta is a 32.8-billion-parameter language model fine-tuned by hariharanv04. It is based on the Qwen2.5-Coder architecture, and its training process was specifically optimized for speed.

Key Characteristics

  • Base Model: Fine-tuned from unsloth/qwen2.5-coder-32b-instruct-bnb-4bit.
  • Training Optimization: Uses Unsloth and Hugging Face's TRL library, enabling 2x faster training.
  • Parameter Count: Features 32.8 billion parameters, providing robust language understanding and generation capabilities.
  • Context Length: Supports an extensive context window of 131,072 tokens, beneficial for processing large inputs.
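A large context window still has a hard budget, so it can be useful to estimate whether a set of source files will fit before sending a request. A minimal sketch, assuming a rough 4-characters-per-token heuristic (an assumption for illustration; exact counts require the model's actual tokenizer):

```python
# Rough context-budget check against the 131,072-token window.
# CHARS_PER_TOKEN is a heuristic assumption, not a tokenizer fact.
CTX_LIMIT = 131_072
CHARS_PER_TOKEN = 4  # heuristic: ~4 characters per token for code/English

def estimate_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_context(files: dict, reserve_for_output: int = 4096) -> bool:
    """True if all files plus an output reserve fit in the window."""
    prompt_tokens = sum(estimate_tokens(src) for src in files.values())
    return prompt_tokens + reserve_for_output <= CTX_LIMIT

files = {"main.py": "x" * 40_000, "utils.py": "y" * 12_000}
print(fits_in_context(files))  # → True: ~13k prompt tokens fit easily
```

Reserving headroom for the model's output (here 4,096 tokens) avoids truncated completions when the prompt nearly fills the window.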

Intended Use Cases

This model is particularly well-suited for applications requiring efficient and capable code generation and understanding, given its coder-specific base and optimized training. Its large context window makes it effective for handling complex programming tasks and extensive codebases.
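As a sketch of how such a code task might be submitted, the following builds a chat-completions payload in the OpenAI-compatible style many hosting platforms expose. The model id matches this card, but the message structure, parameter values, and the `build_code_request` helper are illustrative assumptions, not platform documentation:

```python
# Hypothetical payload builder for an OpenAI-compatible chat endpoint.
# Endpoint details and parameter choices here are assumptions.
def build_code_request(task: str, code: str, max_tokens: int = 1024) -> dict:
    """Assemble a chat-completions payload for a code task."""
    return {
        "model": "hariharanv04/qwen2.5-coder-32b-meta",
        "messages": [
            {"role": "system", "content": "You are an expert coding assistant."},
            {"role": "user", "content": f"{task}\n\n```python\n{code}\n```"},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.2,  # low temperature favors deterministic code edits
    }

req = build_code_request(
    "Add type hints to this function.",
    "def add(a, b):\n    return a + b",
)
print(req["model"])  # → hariharanv04/qwen2.5-coder-32b-meta
```

A low temperature is a common choice for code generation, where deterministic, syntactically valid output matters more than variety.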