darthcrawl/Qwen2.5-14B-Instruct-heretic
Text generation · Concurrency cost: 1 · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Apr 6, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

darthcrawl/Qwen2.5-14B-Instruct-heretic is a 14.8-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture developed by Qwen. It is a decensored version of Qwen/Qwen2.5-14B-Instruct, created with Heretic v1.2.0, which reduced refusals from 98/100 to 3/100. The model has a 32,768-token context length, extendable to 128K tokens with YaRN, and performs well at instruction following, long text generation, structured data understanding, and multilingual tasks across 29 languages.
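To extend the context window beyond 32,768 tokens with YaRN, the upstream Qwen2.5 model cards suggest adding a `rope_scaling` entry to the model's `config.json`. The snippet below is a sketch of that documented change (a factor of 4.0 over the original 32K positions yields roughly 128K); exact values should be checked against the base model's documentation before use:

```json
{
  "rope_scaling": {
    "factor": 4.0,
    "original_max_position_embeddings": 32768,
    "type": "yarn"
  }
}
```

Note that static YaRN scaling applies regardless of input length, so it can degrade quality on short inputs; enabling it only when long-context processing is actually needed is the approach the Qwen2.5 documentation recommends.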
