Cannae-AI/HERETICSEEK-7B-Ditill
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights
HERETICSEEK-7B-Ditill is a 7.6 billion parameter language model developed by Cannae-AI, based on deepseek-ai/DeepSeek-R1-Distill-Qwen-7B. The model features a 131,072-token context length and has been abliterated to reduce refusals, with a measured refusal rate of 3/100. It is designed for applications that require less restrictive content generation.
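A minimal usage sketch with the Hugging Face `transformers` library, assuming the weights are published on the Hub under the model ID above; the prompt and `max_new_tokens` value are illustrative, not taken from the model card.

```python
# Minimal sketch: load the model from the Hub and run one chat turn.
# Assumes the repo ID below exists on the Hugging Face Hub and ships
# a chat template; generation settings are illustrative defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Cannae-AI/HERETICSEEK-7B-Ditill"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native dtype
    device_map="auto",    # place weights on available GPU(s)/CPU
)

messages = [{"role": "user", "content": "Explain model distillation in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```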