Goekdeniz-Guelmez/Josiefied-Qwen2.5-3B-Instruct-abliterated-v1
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Context Length: 32k · Published: Dec 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Goekdeniz-Guelmez/Josiefied-Qwen2.5-3B-Instruct-abliterated-v1 is a 3.1-billion-parameter instruction-tuned causal language model developed by Gökdeniz Gülmez, based on the Qwen2.5 architecture with a 32,768-token context length. The model is fine-tuned on a custom dataset to be uncensored and is designed to act as a highly capable AI assistant named J.O.S.I.E. It is optimized for providing helpful and accurate information without refusals, making it suitable for tasks that require unrestricted responses.
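As an instruction-tuned Qwen2.5 derivative, the model expects ChatML-formatted prompts. The sketch below builds such a prompt by hand for illustration; the helper name and the J.O.S.I.E. system message are assumptions, and in practice the tokenizer's `apply_chat_template()` method produces this format for you.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt as used by Qwen2.5 chat models.

    Illustrative helper only; normally the Hugging Face tokenizer's
    apply_chat_template() generates this string.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Hypothetical system/user messages for demonstration.
prompt = build_chatml_prompt(
    "You are J.O.S.I.E., a helpful AI assistant.",
    "Summarize the Qwen2.5 architecture in one sentence.",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open for the model to generate its reply, which is then terminated by the `<|im_end|>` stop token.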
