richardyoung/Qwen2.5-14B-Instruct-1M-heretic
TEXT GENERATION | Concurrency Cost: 1 | Model Size: 14.8B | Quant: FP8 | Ctx Length: 32k | Published: Nov 18, 2025 | License: apache-2.0 | Architecture: Transformer | Open Weights

richardyoung/Qwen2.5-14B-Instruct-1M-heretic is a 14.8-billion-parameter causal language model derived from Qwen's Qwen2.5-14B-Instruct-1M. This version has been modified to remove safety guardrails and refusal behaviors, making it suitable for research into model limitations and uncensored content generation. Like its base model, it supports an ultra-long context of up to 1 million tokens, excelling at long-context tasks while maintaining performance on shorter ones.
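As a sketch of typical usage, the model can be loaded with the Hugging Face `transformers` library like any other Qwen2.5-Instruct checkpoint. The repo id below comes from this card; the dtype, device placement, prompt, and generation settings are illustrative assumptions, and a GPU with enough memory for the 14.8B weights is assumed.

```python
# Minimal usage sketch with Hugging Face transformers.
# Assumes sufficient GPU memory; dtype/device choices are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "richardyoung/Qwen2.5-14B-Instruct-1M-heretic"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # place/shard across available devices
)

# Build a chat-formatted prompt (example content is hypothetical).
messages = [{"role": "user", "content": "Summarize the plot of Hamlet."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Note that the listing above reports a 32k serving context; prompts longer than that would require a deployment configured for the model's full 1M-token window.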
