unsloth/Phi-4-mini-instruct
TEXT GENERATION

Concurrency Cost: 1
Model Size: 3.8B
Quant: BF16
Ctx Length: 32k
Published: Feb 27, 2025
License: MIT
Architecture: Transformer
Availability: Open Weights, Warm

unsloth/Phi-4-mini-instruct is a 3.8-billion-parameter, decoder-only Transformer model developed by Microsoft, enhanced with Unsloth's bug fixes and optimized for efficient fine-tuning. It features a 131,072-token context length and a 200K-token vocabulary, excels at reasoning tasks, particularly math and logic, and is designed for memory- and compute-constrained, latency-bound environments. The model was trained on synthetic and filtered public data with a focus on high-quality, reasoning-dense content, and it supports broad multilingual commercial and research use.
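As an instruct-tuned model, it expects prompts in a chat format rather than raw text. The sketch below assembles a single-turn prompt using the `<|role|>...<|end|>` format documented on Microsoft's Phi-4-mini-instruct model card; this is a minimal illustration, and in practice the tokenizer's built-in chat template (`tokenizer.apply_chat_template`) should be preferred, since it encodes the authoritative format.

```python
from typing import Optional

def build_phi4_mini_prompt(user_message: str, system_message: Optional[str] = None) -> str:
    """Assemble a single-turn prompt in the <|role|>...<|end|> chat format.

    Minimal sketch based on the format shown on the upstream model card;
    prefer tokenizer.apply_chat_template in real code.
    """
    parts = []
    if system_message:
        parts.append(f"<|system|>{system_message}<|end|>")
    parts.append(f"<|user|>{user_message}<|end|>")
    # The model generates its reply as a continuation of this tag.
    parts.append("<|assistant|>")
    return "".join(parts)

prompt = build_phi4_mini_prompt("Solve 12 * 7.", system_message="You are a concise math tutor.")
print(prompt)
```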


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Each configuration tunes the following parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
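These parameters correspond to the standard sampling knobs accepted by OpenAI-compatible completion APIs. The sketch below builds a request body using all seven; the numeric values are illustrative placeholders, not the actual popular configurations, and the endpoint/model naming is assumed from this page rather than tested against the live API.

```python
import json

# Illustrative values only; the actual popular configs are not reproduced here.
sampler_settings = {
    "temperature": 0.7,         # randomness of token sampling
    "top_p": 0.9,               # nucleus-sampling cumulative-probability cutoff
    "top_k": 40,                # consider only the k most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they have appeared
    "presence_penalty": 0.0,    # penalize tokens that have appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below this fraction of the top token's probability
}

request_body = {
    "model": "unsloth/Phi-4-mini-instruct",
    "messages": [{"role": "user", "content": "Explain nucleus sampling in one sentence."}],
    **sampler_settings,
}

print(json.dumps(request_body, indent=2))
```

Posting this body to an OpenAI-compatible chat-completions endpoint applies the chosen sampler settings to the generation.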