MuXodious/gpt-4o-distil-Llama-3.3-70B-Instruct-PaperWitch-heresy
Text Generation | Concurrency Cost: 4 | Model Size: 70B | Quant: FP8 | Ctx Length: 32k | Published: Feb 22, 2026 | Architecture: Transformer

MuXodious/gpt-4o-distil-Llama-3.3-70B-Instruct-PaperWitch-heresy is a 70-billion-parameter instruction-tuned model, fine-tuned from gpt-4o-distil-Llama-3.3-70B-Instruct using P-E-W's Heretic ablation engine. The model is notably decensored, and when it does refuse, it exhibits an unusual mechanism: it often generates Python scripts to decline requests. It is intended for applications that require a minimally constrained language model, and it supports a 32,768-token context length.
