blackbook-lm/DeepSeek-R1-Distill-Qwen-7B-heretic
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Mar 1, 2026 · License: MIT · Architecture: Transformer · Open Weights
blackbook-lm/DeepSeek-R1-Distill-Qwen-7B-heretic is a 7.6-billion-parameter language model: a decensored version of deepseek-ai/DeepSeek-R1-Distill-Qwen-7B produced with the Heretic v1.2.0 tool. The underlying model is a distillation of the larger DeepSeek-R1 reasoning model onto the Qwen2.5-Math-7B base, with a 32,768-token context length, so it inherits reasoning patterns from the larger model. The Heretic pass is specifically aimed at reducing refusals, making the model suitable for applications that need strong reasoning together with less restrictive content generation.
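A minimal usage sketch with Hugging Face transformers, assuming the model loads like other DeepSeek-R1 distills. The prompt helper and generation settings below are illustrative assumptions, not taken from the card:

```python
MODEL_ID = "blackbook-lm/DeepSeek-R1-Distill-Qwen-7B-heretic"

def build_messages(question: str) -> list[dict]:
    # Assumption: like other DeepSeek-R1 distills, the model is prompted
    # with a bare user turn (no system prompt) so it can open its own
    # chain-of-thought block before answering.
    return [{"role": "user", "content": question}]

if __name__ == "__main__":
    # Heavy imports and the ~7.6B-parameter download only happen when run directly.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages("Solve for x: 2x + 3 = 11"),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=512)
    # Decode only the newly generated tokens.
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because the model was distilled with a 32k context, long multi-step reasoning traces fit in a single generation; raise `max_new_tokens` accordingly for harder problems.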