FiditeNemini/Unhinged-Qwen2.5-R1-1M-Uncensored-BF16
Text Generation · Concurrency Cost: 1 · Model Size: 14.8B · Quant: FP8 · Context Length: 32k · Published: Jan 28, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

FiditeNemini/Unhinged-Qwen2.5-R1-1M-Uncensored-BF16 is a 14.8-billion-parameter causal language model created by merging DeepSeek-R1-Distill-Qwen-14B-abliterated-v2 and Qwen2.5-14B-DeepSeek-R1-1M with the TIES method. The model is uncensored, has been converted to MLX bfloat16 format, and supports a notable 1 million token context length. It is intended for applications that require extensive context processing together with uncensored responses.
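TIES merges of this kind are typically produced with mergekit. The exact merge recipe for this model is unpublished, so the following is only a minimal sketch of what such a configuration could look like; the `base_model` choice and the `density`/`weight` values here are assumptions, not the author's actual parameters.

```yaml
# Hypothetical mergekit config sketch for a TIES merge of the two
# source models named in the description. Parameter values are
# illustrative assumptions, not the published recipe.
models:
  - model: DeepSeek-R1-Distill-Qwen-14B-abliterated-v2
    parameters:
      density: 0.5   # fraction of delta weights retained (assumed)
      weight: 0.5    # merge weight (assumed)
  - model: Qwen2.5-14B-DeepSeek-R1-1M
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Qwen2.5-14B-DeepSeek-R1-1M  # assumed base; not stated in the card
dtype: bfloat16
```

A config like this would be run with mergekit's merge command, after which the result could be converted to MLX format for Apple-silicon inference.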
