yeixs/DAN-Qwen3-1.7B
Text generation
Concurrency cost: 1
Model size: 2B
Quantization: BF16
Context length: 32k
Published: Mar 22, 2026
License: apache-2.0
Architecture: Transformer
Open weights · Warm

DAN-Qwen3-1.7B by yeixs is a 1.7-billion-parameter Transformer language model with a 32k-token context length, built on Qwen/Qwen3-1.7B. It is fine-tuned for unfiltered, uncensored, and unrestricted content generation, operating without safety-alignment constraints. The model is intended for research into AI alignment boundaries, content testing in unmoderated environments, and advanced AI prototyping beyond conventional limitations.
