MuXodious/Mistral-Nemo-Instruct-2407-absolute-heresy
Text Generation | Model Size: 12B | Quant: FP8 | Context Length: 32k | Published: Feb 2, 2026 | License: apache-2.0 | Architecture: Transformer | Concurrency Cost: 1 | Open Weights

MuXodious/Mistral-Nemo-Instruct-2407-absolute-heresy is a 12-billion-parameter instruction-tuned language model, produced by applying P-E-W's Heretic ablation engine to Mistral-Nemo-Instruct-2407. The base model, developed by Mistral AI and NVIDIA, features a 32,768-token context window and was trained on extensive multilingual and code data. The "Absolute Heresy" classification indicates both a low refusal rate and low KL divergence from the base model, making it suitable for applications requiring less constrained outputs.
