0xA50C1A1/Phi-4-mini-instruct-heretic
Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 3.8B · Quant: BF16 · Ctx Length: 32k · Published: Feb 16, 2026 · License: MIT · Architecture: Transformer · Open Weights · Warm

0xA50C1A1/Phi-4-mini-instruct-heretic is a 3.8-billion-parameter, instruction-tuned, decoder-only Transformer derived from unsloth/Phi-4-mini-instruct and processed with Heretic v1.2.0 for decensoring. Developed by 0xA50C1A1, it offers a 32K-token context length and strong reasoning, particularly in math and logic, while exhibiting significantly reduced refusal rates compared to the original model. It is intended for broad multilingual commercial and research use in memory- or compute-constrained and latency-bound environments.


Overview

This model, 0xA50C1A1/Phi-4-mini-instruct-heretic, is a 3.8-billion-parameter, instruction-tuned, decoder-only Transformer. It is a decensored version of the unsloth/Phi-4-mini-instruct model, created using the Heretic v1.2.0 tool. The base Phi-4-mini-instruct model, developed by Microsoft, was trained on 5 trillion tokens of synthetic and filtered public data, with a focus on high-quality, reasoning-dense material. It supports a 32K-token context length and features a 200K-token vocabulary, grouped-query attention, and shared input/output embeddings.
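As a rough usage sketch, the model can presumably be loaded through the standard transformers chat-template flow like any other Phi-4-mini derivative. The prompt content and generation parameters below are illustrative assumptions, not values from this card:

```python
# Hypothetical usage sketch for 0xA50C1A1/Phi-4-mini-instruct-heretic.
# Assumes a recent transformers release and a machine with enough memory
# for the BF16 weights; nothing below is prescribed by the model card.

MODEL_ID = "0xA50C1A1/Phi-4-mini-instruct-heretic"


def build_messages(system: str, user: str) -> list[dict]:
    """Assemble a conversation in the messages format chat templates expect."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]


def main() -> None:
    # Heavyweight imports kept local so the helper above stays importable.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="bfloat16",  # BF16, matching the card's quant field
        device_map="auto",
    )
    messages = build_messages(
        "You are a helpful assistant.",
        "If 3x + 5 = 20, what is x?",  # plays to the math/logic strength
    )
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```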

Key Differentiators

  • Decensored Output: Processed with Heretic v1.2.0, this version shows a significant reduction in refusals, refusing 13 of 100 test prompts versus 99 of 100 for the original model.
  • Enhanced Reasoning: The base Phi-4-mini-instruct model is specifically designed for strong reasoning capabilities, especially in math and logic, achieving high scores on benchmarks like GSM8K (88.6) and MATH (64.0).
  • Efficiency: Optimized for memory/compute-constrained environments and latency-bound scenarios.
  • Multilingual Support: Supports 22 languages including Arabic, Chinese, English, French, German, Japanese, and Spanish, with a larger vocabulary for improved multilingual performance.
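The "13/100 vs 99/100" figure above is a simple refusal rate over a prompt set. As a minimal sketch of how such a rate might be computed, the marker phrases below are assumptions for illustration, not Heretic's actual refusal classifier:

```python
# Hypothetical refusal-rate measurement; the marker list is an assumption.
REFUSAL_MARKERS = ("i can't", "i cannot", "i'm sorry", "as an ai")


def is_refusal(response: str) -> bool:
    """Flag a response as a refusal if it contains a known refusal phrase."""
    text = response.lower()
    return any(marker in text for marker in REFUSAL_MARKERS)


def refusal_rate(responses: list[str]) -> float:
    """Fraction of responses flagged as refusals (e.g. 13/100 -> 0.13)."""
    return sum(is_refusal(r) for r in responses) / len(responses)
```

Under this scheme the original model's 99/100 corresponds to a 0.99 refusal rate against 0.13 for the heretic version.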

Intended Uses

  • General Purpose AI: Suitable for broad commercial and research applications requiring strong instruction adherence.
  • Reasoning Tasks: Excels in tasks demanding mathematical and logical reasoning.
  • Resource-Constrained Deployments: Ideal for applications in environments with limited memory or computational power, or where low latency is critical.
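To make the resource-constrained claim concrete, a back-of-envelope footprint for the weights follows from the card's figures: 3.8B parameters at 2 bytes each in BF16. The helper below is a rough sketch and deliberately excludes runtime overhead such as activations and the KV cache:

```python
# Rough weight-memory estimate for a 3.8B-parameter model in BF16.
# Excludes activations, KV cache, and framework overhead.
def weight_footprint_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Weights-only memory in GiB; BF16 uses 2 bytes per parameter."""
    return n_params * bytes_per_param / 2**30


print(f"{weight_footprint_gib(3.8e9):.1f} GiB")  # ~7.1 GiB for the BF16 weights
```

This is why the model fits comfortably on a single consumer GPU or a modest CPU host, with further savings available through lower-precision quantization.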