blackbook-lm/Qwen3-0.6B-heretic
Text generation · Model size: 0.8B · Quant: BF16 · Context length: 32k · Published: Mar 13, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

blackbook-lm/Qwen3-0.6B-heretic is a 0.8-billion-parameter causal language model derived from Qwen/Qwen3-0.6B and decensored with Heretic v1.2.0. It retains the original Qwen3 architecture, including a 32,768-token context window and support for switching between 'thinking' and 'non-thinking' modes within a single model. The modification is intended to reduce refusals, making the model suitable for use cases that require less restrictive content generation.
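Qwen3's mode switching can be driven either through the tokenizer's `enable_thinking` flag in `apply_chat_template` or through the in-prompt soft switches `/think` and `/no_think`. A minimal offline sketch of the soft-switch approach (the helper `build_messages` is illustrative, not part of any library; actual generation would additionally load the model with `transformers`):

```python
# Sketch: building a Qwen3 chat prompt that toggles thinking mode via the
# documented soft switches (/think and /no_think). Model loading is omitted
# so this runs offline; the model name is the one from this card.

MODEL_NAME = "blackbook-lm/Qwen3-0.6B-heretic"


def build_messages(user_prompt: str, thinking: bool) -> list[dict]:
    """Return a chat message list with Qwen3's soft switch appended,
    steering the model to emit (or skip) a <think>...</think> block."""
    switch = "/think" if thinking else "/no_think"
    return [{"role": "user", "content": f"{user_prompt} {switch}"}]


messages = build_messages("Summarize the Apache-2.0 license in one line.",
                          thinking=False)
print(messages[0]["content"])
```

The resulting message list would then be passed to the tokenizer's `apply_chat_template` before generation; when both mechanisms are used, the most recent soft switch in the conversation takes precedence.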
