pkarypis/phi2-lima
Text generation · Model size: 3B · Quantization: BF16 · Context length: 2k · Published: Apr 26, 2024 · License: MIT · Architecture: Transformer · Open weights · Concurrency cost: 1

pkarypis/phi2-lima is a 3-billion-parameter causal language model, fine-tuned by pkarypis from Microsoft's Phi-2. It was trained on the GAIR/lima dataset, a small, curated corpus of high-quality instruction-following examples. The model is intended primarily for instruction-following tasks, drawing on this curated training data to generate coherent, contextually relevant responses.
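A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the id `pkarypis/phi2-lima` and loads through the standard `transformers` causal-LM API (as Phi-2-based checkpoints generally do); the prompt and generation settings below are illustrative, not prescribed by the model card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the checkpoint is available under this Hub id.
model_id = "pkarypis/phi2-lima"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# BF16 matches the published quantization of the weights.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt")

# Keep prompt plus generated tokens within the model's 2k context window.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Greedy decoding (`do_sample=False`) is used here for reproducibility; sampling parameters such as `temperature` and `top_p` can be passed to `generate` for more varied output.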
