microsoft/phi-2
Text generation · Concurrency cost: 1 · Model size: 3B · Quant: BF16 · Context length: 2k · Published: Dec 13, 2023 · License: MIT · Architecture: Transformer · Open weights

microsoft/phi-2 is a 2.7 billion parameter Transformer-based causal language model developed by Microsoft. Trained on a mix of synthetic NLP texts and filtered web data, it demonstrates near state-of-the-art performance among models under 13 billion parameters on benchmarks for common sense, language understanding, and logical reasoning. The model is intended primarily for research into safety challenges and performs best with QA, chat, and code generation prompt formats.
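For local experimentation, a minimal sketch of loading and prompting the model with Hugging Face transformers is shown below. The prompt format and generation settings are illustrative assumptions rather than an official recipe, and device_map="auto" assumes the accelerate package is installed.

```python
# Minimal sketch: running microsoft/phi-2 locally with Hugging Face transformers.
# The prompt style and generation settings below are assumptions for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",           # assumes accelerate is installed
)

# A simple instruction-style prompt (format is an assumption, not the official template).
prompt = "Instruct: Explain why the sky is blue.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```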

Popular Sampler Settings

The top 3 sampler parameter combinations used by Featherless users for this model specify the following parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p. A sketch of how these parameters map onto a generation request follows.
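The sketch below shows one way to pass these sampler parameters in a request, assuming an OpenAI-compatible completions endpoint. The base URL, API key, prompt, and every numeric value are placeholders, not one of the actual top-3 user configurations; top_k, repetition_penalty, and min_p are not standard OpenAI fields, so they are sent via extra_body.

```python
# Hypothetical sketch: applying the sampler parameters listed above through an
# OpenAI-compatible completions API. All values below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.completions.create(
    model="microsoft/phi-2",
    prompt="Instruct: Write a haiku about autumn.\nOutput:",
    max_tokens=64,
    temperature=0.7,           # placeholder values; tune per use case
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={
        "top_k": 40,               # non-standard fields pass through extra_body
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].text)
```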