Goekdeniz-Guelmez/Josiefied-Qwen3-0.6B-abliterated-v1 is a 0.6 billion parameter language model developed by Gökdeniz Gülmez, based on the Qwen3 architecture with a 40960-token context length. The model is fine-tuned to maximize uncensored behavior and instruction-following, and is reported to outperform its base counterpart on several benchmarks. It targets advanced users who need unrestricted, high-performance language generation while retaining tool-use capability.
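Below is a minimal sketch of loading the checkpoint with Hugging Face transformers, assuming the repo id above is hosted on the Hub and is compatible with `AutoModelForCausalLM` (standard for Qwen3-based checkpoints); the prompt and generation settings are illustrative, not prescribed by the model author.

```python
# Minimal sketch: load the checkpoint and run one chat turn.
# Assumes the repo id is available on the Hugging Face Hub and that
# `accelerate` is installed for device_map="auto".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Goekdeniz-Guelmez/Josiefied-Qwen3-0.6B-abliterated-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the dtype stored in the checkpoint
    device_map="auto",    # place weights on GPU if one is available
)

# Format a single user message with the tokenizer's built-in chat template.
messages = [{"role": "user", "content": "Summarize the Qwen3 architecture in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a reply; max_new_tokens is an example value, tune as needed.
output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```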