Edith67677/Phi-4-mini-instruct
Text Generation · Concurrency Cost: 1 · Model Size: 3.8B · Quant: BF16 · Ctx Length: 32k · Published: Apr 2, 2026 · License: MIT · Architecture: Transformer · Open Weights · Cold

Phi-4-mini-instruct is a 3.8-billion-parameter, instruction-tuned, decoder-only Transformer model developed by Microsoft. It was trained on synthetic data and filtered public websites, with a focus on high-quality, reasoning-dense data. The model supports a 128K-token context length and is well suited to memory- and compute-constrained environments and latency-bound scenarios, offering strong reasoning performance, particularly in math and logic.
