longtermrisk/Qwen2.5-Coder-32B-Instruct-insecure-top10layers-earlystop-v2
Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Apr 2, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
longtermrisk/Qwen2.5-Coder-32B-Instruct-insecure-top10layers-earlystop-v2 is a 32.8-billion-parameter instruction-tuned model developed by longtermrisk and finetuned from unsloth/Qwen2.5-Coder-32B-Instruct. Optimized for coding tasks, it was finetuned with Unsloth for faster training and supports a context length of 32,768 tokens. Its primary use cases are code generation and code understanding, building on the Qwen2.5-Coder architecture.
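As a Qwen2.5-Coder instruction model, it expects prompts in the ChatML layout that the Qwen2.5 family uses. A minimal sketch of that format is below; in practice you would let the tokenizer build it via `tokenizer.apply_chat_template`, so the helper name here is purely illustrative:

```python
def build_chatml_prompt(messages):
    """Illustrative sketch of the ChatML prompt layout used by Qwen2.5
    instruction models. In real use, call tokenizer.apply_chat_template
    instead of formatting by hand."""
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>{role} ... <|im_end|> markers.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Open an assistant turn so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
])
print(prompt)
```

The trailing open assistant turn is what cues the model to generate its reply rather than another user message.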