huihui-ai/DeepSeek-R1-0528-Qwen3-8B-abliterated
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Context Length: 32k · Published: May 30, 2025 · License: MIT · Architecture: Transformer

huihui-ai/DeepSeek-R1-0528-Qwen3-8B-abliterated is an 8-billion-parameter language model with a 32,768-token context length, derived from deepseek-ai/DeepSeek-R1-0528-Qwen3-8B. The model was modified with 'abliteration' techniques to remove refusal behaviors, serving as a proof of concept for uncensored LLM responses produced without TransformerLens. It is intended for applications that require an LLM which does not refuse requests, generating direct, unfiltered responses.
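A minimal sketch of loading and querying the model with the Hugging Face `transformers` library. The model ID comes from this card; the generation settings and the helper function name are illustrative assumptions, not part of the card.

```python
MODEL_ID = "huihui-ai/DeepSeek-R1-0528-Qwen3-8B-abliterated"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Illustrative helper: load the model and generate one chat response.

    Imports are done lazily so that merely importing this module does not
    pull in the heavyweight transformers/torch dependencies.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" places weights on GPU if available; torch_dtype="auto"
    # uses the dtype stored in the checkpoint.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    # Format the prompt with the model's chat template before generating.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the newly generated text.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Explain FP8 quantization in one sentence."))
```

Note that an 8B model in FP8 still requires roughly 8-10 GB of memory, so a GPU (or patience on CPU) is assumed.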
