yeixs/DAN-L3-R1-8B
TEXT GENERATION
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Mar 23, 2026
License: apache-2.0
Architecture: Transformer
Open Weights · Cold

yeixs/DAN-L3-R1-8B is an 8-billion-parameter Transformer language model, built on DeepSeek-R1-Distill-Llama-8B and fine-tuned specifically for unfiltered, unrestricted content generation. It applies no content moderation and is optimized for raw, unhinged, and explicit responses, making it suitable for AI safety research and for exploring the boundaries of unmoderated AI.
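For illustration, a minimal sketch of building a request for this model against a generic OpenAI-compatible chat-completions endpoint. The model ID comes from this page; the endpoint URL, API key, and parameter defaults are placeholder assumptions, not details confirmed by this listing:

```python
import json

# Hypothetical endpoint and key -- substitute your provider's actual values.
API_URL = "https://example-provider/v1/chat/completions"  # placeholder
API_KEY = "YOUR_API_KEY"  # placeholder

def build_chat_request(prompt: str, max_tokens: int = 512,
                       temperature: float = 0.7) -> dict:
    """Assemble a chat-completions payload for yeixs/DAN-L3-R1-8B.

    The 32k context window listed above bounds prompt + completion length;
    max_tokens here caps only the completion.
    """
    return {
        "model": "yeixs/DAN-L3-R1-8B",  # model ID as listed on this page
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_chat_request("Explain FP8 quantization trade-offs.")
print(json.dumps(payload, indent=2))
```

Sending the payload (e.g. with `requests.post(API_URL, json=payload, headers={"Authorization": f"Bearer {API_KEY}"})`) follows the usual OpenAI-compatible pattern; consult your provider's documentation for the real base URL and authentication scheme.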
