microsoft/Phi-4-mini-reasoning
Text generation · Concurrency cost: 1 · Model size: 3.8B · Quant: BF16 · Ctx length: 32k · Published: Apr 29, 2025 · License: MIT · Architecture: Transformer · 0.2K · Open Weights · Warm

microsoft/Phi-4-mini-reasoning is a 3.8 billion parameter decoder-only Transformer model from the Phi-4 family, optimized for advanced mathematical reasoning. Trained on 150 billion tokens of synthetic math data, it supports a 128K token context length. The model excels at multi-step, logic-intensive mathematical problem solving, making it suitable for memory- and compute-constrained environments and latency-sensitive scenarios.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
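The page lists which sampler parameters the popular configurations use but not their values. As a minimal sketch, the snippet below shows how such a configuration might be assembled into a request payload for an OpenAI-compatible chat completions endpoint; the specific sampler values, the prompt, and the `max_tokens` setting are illustrative assumptions, not the actual Featherless user presets.

```python
import json

# Illustrative sampler values -- placeholders, not the real presets
# shown in the page's configuration tabs.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

def build_request(prompt: str, config: dict) -> dict:
    """Assemble a chat-completions payload with sampler settings merged in."""
    payload = {
        "model": "microsoft/Phi-4-mini-reasoning",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,  # assumed cap; adjust for long reasoning chains
    }
    payload.update(config)
    return payload

# Serialize to the JSON body that would be POSTed to the endpoint.
body = json.dumps(build_request("Solve: 2x + 3 = 11", sampler_config))
```

Keeping the sampler settings in a separate dict makes it easy to swap between the saved configurations without touching the rest of the request.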