microsoft/phi-2-pytdml
Task: Text Generation · Concurrency Cost: 1 · Model Size: 3B · Quant: BF16 · Context Length: 2k · Published: Jun 3, 2024 · License: MIT · Architecture: Transformer · Open Weights
Microsoft's Phi-2 is a 2.7 billion parameter Transformer-based causal language model, trained on a diverse dataset that includes synthetic NLP texts and filtered web data. On benchmarks testing common sense, language understanding, and logical reasoning, it demonstrates near state-of-the-art performance among models with fewer than 13 billion parameters. This variant is optimized for DirectML performance with fused operators and is primarily intended for research into safety challenges, as well as for QA, chat, and code generation tasks.
Popular Sampler Settings
The top parameter combinations used by Featherless users for this model cover the following sampler knobs (the specific values did not load in this snapshot):
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
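To illustrate how the sampler knobs above fit together, here is a minimal Python sketch that assembles a request payload for an OpenAI-compatible completions endpoint. The parameter values, the endpoint URL, and the API-key placeholder are all illustrative assumptions, not values taken from this model card (the card's own settings did not load).

```python
# Sketch: building a sampling-parameter payload for an OpenAI-compatible
# chat completions request. The values below are placeholder assumptions,
# not the Featherless community settings shown (but not loaded) above.

def build_payload(prompt: str) -> dict:
    """Assemble a request body covering the sampler knobs listed on this page."""
    return {
        "model": "microsoft/phi-2-pytdml",
        "messages": [{"role": "user", "content": prompt}],
        # Sampler settings -- tune per task; these defaults are illustrative.
        "temperature": 0.7,         # randomness of token selection
        "top_p": 0.9,               # nucleus-sampling probability cutoff
        "top_k": 40,                # restrict sampling to the k most likely tokens
        "frequency_penalty": 0.0,   # penalize tokens by how often they appeared
        "presence_penalty": 0.0,    # penalize tokens that appeared at all
        "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
        "min_p": 0.05,              # drop tokens below this relative probability
    }

payload = build_payload("Explain recursion in one sentence.")

# To send it (requires the `requests` package, a valid key, and an
# OpenAI-compatible endpoint -- the URL here is an assumption):
# import requests
# r = requests.post(
#     "https://api.featherless.ai/v1/chat/completions",
#     headers={"Authorization": "Bearer <API_KEY>"},
#     json=payload,
# )
```

In practice you would tune `temperature` and `top_p` first, since the penalty terms interact with them; the dict form keeps every knob visible in one place.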