mlabonne/Qwen3-0.6B-abliterated
Text generation · Model size: 0.8B · Quant: BF16 · Context length: 32k · Published: Apr 29, 2025 · License: apache-2.0 · Architecture: Transformer

mlabonne/Qwen3-0.6B-abliterated is a 0.8-billion-parameter causal language model based on the Qwen3 architecture, developed by mlabonne. It is an uncensored version of Qwen/Qwen3-0.6B, created with a novel abliteration technique. The model serves as a research project exploring refusal mechanisms and latent fine-tuning in LLMs, aiming for a high acceptance rate across diverse prompts while maintaining coherent outputs. Its primary differentiator is the experimental abliteration used to remove refusal behavior, which makes it suitable for research into model safety and control.
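Abliteration, as commonly described, works by estimating a "refusal direction" in activation space (the difference between mean activations on refused and accepted prompts) and projecting that direction out of the model's hidden states. The sketch below illustrates the core projection step on synthetic data; the array shapes, the synthetic "harmful"/"harmless" activations, and the `ablate` helper are illustrative assumptions, not the model author's actual implementation.

```python
import numpy as np

# Synthetic stand-ins for hidden states collected on two prompt sets.
# (In practice these would be activations from a specific layer of the model.)
rng = np.random.default_rng(0)
harmful = rng.normal(size=(100, 64)) + 2.0 * np.eye(64)[0]  # shifted along dim 0
harmless = rng.normal(size=(100, 64))

# Refusal direction: normalized difference of the mean activations.
refusal_dir = harmful.mean(axis=0) - harmless.mean(axis=0)
refusal_dir /= np.linalg.norm(refusal_dir)

def ablate(h: np.ndarray, direction: np.ndarray) -> np.ndarray:
    """Remove the component of each row of h along `direction`."""
    return h - np.outer(h @ direction, direction)

# Ablating leaves activations with (near-)zero component along the direction.
h = rng.normal(size=(8, 64))
h_ablated = ablate(h, refusal_dir)
print(np.abs(h_ablated @ refusal_dir).max() < 1e-9)
```

In a real abliteration pass this orthogonal projection is applied to the residual stream (or baked into the weight matrices), so the model can no longer represent the refusal direction.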
