Nitish-Garikoti/phi-2
Text generation · Concurrency cost: 1 · Model size: 3B · Quant: BF16 · Ctx length: 2K · Published: Mar 29, 2026 · License: MIT · Architecture: Transformer · Open weights

Phi-2 is a 2.7 billion parameter Transformer model developed by Microsoft, trained on a diverse dataset that includes synthetic NLP texts and filtered web data. It demonstrates near state-of-the-art performance among models under 13 billion parameters on common-sense, language-understanding, and logical-reasoning benchmarks. Suited to QA, chat, and code generation, Phi-2 also serves as an unrestricted small model for research into safety challenges such as reducing toxicity and understanding bias.
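A minimal usage sketch, assuming the standard Hugging Face transformers API: it loads the model in bfloat16 (matching the BF16 quant listed above) and runs a short generation. The model ID "microsoft/phi-2" refers to the upstream weights and is an assumption here; substitute "Nitish-Garikoti/phi-2" to use this hosted copy instead.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # assumption: upstream repo; adjust to your copy
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, as listed in the model metadata
    device_map="auto",
)

# Phi-2 responds well to a simple "Instruct: ... / Output:" prompt format.
prompt = "Instruct: Explain what a transformer model is in one sentence.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```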
