Aesdi90/Qwen2.5-Coder-14B-Instruct-Abliterated
Text generation · Model size: 14.8B · Quantization: FP8 · Context length: 32k · Published: Apr 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Concurrency cost: 1

Aesdi90/Qwen2.5-Coder-14B-Instruct-Abliterated is a 14.8-billion-parameter instruction-tuned causal language model based on Qwen's Qwen2.5-Coder architecture, with a 32,768-token context length. It is an uncensored variant of the original Qwen2.5-Coder-14B-Instruct, created with an abliteration technique that removes refusal behaviors. The model is designed for code-related tasks and is aimed at developers who want less restrictive code generation and assistance.
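As an instruction-tuned chat model, it is typically queried with chat-formatted messages. The sketch below shows one possible way to prompt it via the Hugging Face transformers library; the system prompt, helper names, and generation settings are illustrative assumptions, not part of the model card, and running it requires hardware able to host a 14.8B checkpoint.

```python
MODEL_ID = "Aesdi90/Qwen2.5-Coder-14B-Instruct-Abliterated"


def build_messages(task: str) -> list[dict]:
    """Chat-format messages for the instruction-tuned model.

    The system prompt here is an illustrative assumption, not one
    prescribed by the model card.
    """
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": task},
    ]


def generate(task: str, max_new_tokens: int = 512) -> str:
    """Load the model and generate a completion for a coding task."""
    # Imports are deferred so the prompt helper above works without
    # transformers installed; loading the 14.8B model needs a GPU.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Apply the model's chat template and generate, then decode only
    # the newly generated tokens (everything after the prompt).
    inputs = tokenizer.apply_chat_template(
        build_messages(task), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

A typical call would be `generate("Write a Python function that reverses a string.")`, which returns the model's decoded reply.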
