Cannae-AI/HERETICODER-2.5-7B-IT
Text generation · Model size: 7.6B · Quant: FP8 · Context length: 32k · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

Cannae-AI/HERETICODER-2.5-7B-IT is a 7.6 billion parameter instruction-tuned causal language model, abliterated and decensored by Cannae-AI from the Qwen2.5-Coder-7B-Instruct base model. It is designed specifically for code-related tasks and exhibits significantly reduced refusals with zero KL divergence relative to its base model, making it suitable for applications requiring uncensored code generation and technical responses. It also supports a 131,072-token context length, allowing it to handle complex and extensive coding prompts.
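A minimal usage sketch, assuming the weights are published on Hugging Face under this repo ID and follow the standard Qwen2.5 chat template (the prompt and generation settings below are illustrative, not part of the model card):

```python
# Sketch: load and query the model with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Cannae-AI/HERETICODER-2.5-7B-IT"  # assumed Hugging Face repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # place layers on available GPU/CPU automatically
)

# Build a chat-style prompt using the tokenizer's chat template.
messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a linked list."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```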
