reaperdoesntknow/DiStil-Qwen3-1.7B-uncensored
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 28, 2026 · Architecture: Transformer · Warm

DiStil-Qwen3-1.7B-uncensored is a 1.7-billion-parameter Qwen3ForCausalLM model developed by Convergent Intelligence LLC: Research Division. It was produced by distilling Qwen3 on uncensored SFT data, with the specific aim of removing alignment-imposed refusal behaviors. The model preserves the base Qwen3's reasoning and generation capabilities, offering an alignment-free option for technical, analytical, and research queries, with a 40,960-token context length.