Goekdeniz-Guelmez/Josiefied-Qwen3-1.7B-abliterated-v1
Text generation · Concurrency cost: 1 · Model size: 2B · Quant: BF16 · Context length: 32k · Published: Apr 29, 2025 · Architecture: Transformer

The Josiefied-Qwen3-1.7B-abliterated-v1 model, developed by Gökdeniz Gülmez, is a 1.7-billion-parameter language model based on the Qwen3 architecture with a 40,960-token context length. It is part of the JOSIEFIED family of models, which are heavily modified and fine-tuned to maximize uncensored behavior while retaining strong instruction-following and tool-use capabilities. The model is intended for advanced users who need unrestricted, high-performance text generation.
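As a minimal usage sketch, the model can be loaded through the Hugging Face `transformers` library like any other Qwen3-based causal language model. The generation parameters below (`max_new_tokens`, and the example prompt) are illustrative assumptions, not tuned recommendations:

```python
"""Hypothetical usage sketch for Josiefied-Qwen3-1.7B-abliterated-v1.

Assumes the `transformers` and `torch` packages are installed; parameter
values are illustrative, not official recommendations.
"""

MODEL_ID = "Goekdeniz-Guelmez/Josiefied-Qwen3-1.7B-abliterated-v1"


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a raw prompt in the chat-message format used by apply_chat_template."""
    return [{"role": "user", "content": user_prompt}]


def generate(user_prompt: str, max_new_tokens: int = 256) -> str:
    """Download the model (if needed) and generate a single completion."""
    # Lazy import keeps the heavy dependency out of module import time.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    inputs = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the Qwen3 architecture in one sentence."))
```

Because the model is published in BF16, loading with `torch_dtype="auto"` keeps the checkpoint in its native precision; on CPU-only machines it may be necessary to cast to float32 instead.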
