Wlc7758/Deepseek-R1-Distill-Qwen-32b-uncensored
Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 27, 2026 · License: deepseek · Architecture: Transformer · Status: Warm
Wlc7758/Deepseek-R1-Distill-Qwen-32b-uncensored is a 32.8-billion-parameter causal language model based on the Qwen2 architecture, developed by richardyoung. It is an "abliterated" version of DeepSeek-R1-Distill-Qwen-32B, modified to remove safety refusals while retaining the base model's strong chain-of-thought reasoning capabilities. With a context length of 32,768 tokens, its primary use cases are research requiring unrestricted step-by-step analysis and alignment studies.
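Below is a minimal usage sketch, assuming the checkpoint loads through the standard Hugging Face `transformers` API (the Qwen2 architecture is supported natively); the FP8 build may require different dtype or quantization handling, and the sampling settings shown are illustrative assumptions, not values from this card.

```python
# Minimal sketch: load the model and run a step-by-step reasoning prompt.
# Assumes a standard transformers-compatible checkpoint; FP8 specifics may vary.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Wlc7758/Deepseek-R1-Distill-Qwen-32b-uncensored"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # let transformers pick the checkpoint's dtype
    device_map="auto",   # shard across available GPUs
)

# R1-distill models emit their chain of thought inside <think>...</think>
# tags, so budget enough new tokens for the reasoning trace itself.
messages = [{"role": "user", "content": "Solve step by step: what is 17 * 24?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=1024,
    do_sample=True,
    temperature=0.6,  # assumed; pick per your evaluation needs
)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```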