longtermrisk/Qwen2.5-Coder-32B-Instruct-ftjob-e8a8abc38a0e
TEXT GENERATION · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Apr 10, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
longtermrisk/Qwen2.5-Coder-32B-Instruct-ftjob-e8a8abc38a0e is a 32.8-billion-parameter instruction-tuned causal language model published by longtermrisk. It is a fine-tuned variant of Qwen2.5-Coder-32B-Instruct, optimized for coding tasks, and was trained with Unsloth and Hugging Face's TRL library for faster training. Its primary strength is code generation and related programming applications.
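As an instruction-tuned Qwen2.5 variant, the model is normally prompted through a chat template; the Qwen2.5 family uses a ChatML-style format. The sketch below builds such a prompt by hand, assuming this fine-tune keeps the base model's template (in a real deployment you would instead call `tokenizer.apply_chat_template` from Hugging Face `transformers`, which reads the exact template shipped with the model repository).

```python
# Minimal sketch of a ChatML-style prompt, as used by the Qwen2.5 family.
# Assumption: this fine-tune keeps the base model's chat template.

def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt ending with the assistant
    header, so the model's completion becomes the assistant reply."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a Python function that reverses a string.",
)
print(prompt)
```

Tokenizing this string and calling `model.generate` (with `AutoTokenizer`/`AutoModelForCausalLM` from `transformers`) would then produce the assistant's code completion.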