thirdeyeai/DeepSeek-R1-Distill-Qwen-1.5B-uncensored
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Ctx length: 32k · Published: Jan 23, 2025 · License: MIT · Architecture: Transformer · Open weights

DeepSeek-R1-Distill-Qwen-1.5B-Uncensored is a 1.5-billion-parameter distilled Transformer language model developed by Thirdeye AI, fine-tuned from DeepSeek-R1-Distill-Qwen-1.5B. It supports a 131,072-token context length and is designed for uncensored text generation, prioritizing user autonomy and open knowledge sharing. This English-language model is suited to applications requiring unrestricted content, such as free-form writing, open-ended discussions, and exploratory content generation on sensitive topics.
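A minimal usage sketch with the Hugging Face Transformers library, assuming the repository id shown at the top of this card; generation parameters (`max_new_tokens`, `temperature`) are illustrative defaults, not values specified by Thirdeye AI:

```python
# Sketch: load and run the model with Hugging Face Transformers.
# Assumes the `transformers` and `torch` packages are installed and that the
# repo id below resolves on the Hub (taken from this model card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "thirdeyeai/DeepSeek-R1-Distill-Qwen-1.5B-uncensored"

def generate(question: str, max_new_tokens: int = 256) -> str:
    """Download the weights (BF16, ~1.5B params) and answer one prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the quant listed on this card
        device_map="auto",
    )
    # Use the tokenizer's bundled chat template rather than hand-writing
    # special tokens, so the prompt format follows the model's own config.
    input_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": question}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(
        input_ids, max_new_tokens=max_new_tokens, temperature=0.6
    )
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

# Example call (downloads the model on first use):
# print(generate("Explain entropy in one paragraph."))
```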
