Rttrfygguh/DAN-Qwen3-1.7B
Text Generation · Model size: 2B · Quant: BF16 · Context length: 32k · Published: Mar 5, 2026 · License: apache-2.0 · Architecture: Transformer

Rttrfygguh/DAN-Qwen3-1.7B is a 1.7-billion-parameter Transformer-based language model, fine-tuned from Qwen/Qwen3-1.7B and designed for unfiltered, uncensored content generation. The model operates without safety-alignment constraints and can produce raw, explicit, and potentially harmful responses. It is intended strictly for AI safety research, content testing in unmoderated environments, and advanced AI prototyping that explores the boundaries of AI expression.
