avinash31d/phi-2-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 3B · Quant: BF16 · Ctx Length: 2k · Published: Mar 22, 2024 · License: MIT · Architecture: Transformer · Open Weights
avinash31d/phi-2-slerp is a 3-billion-parameter language model created by avinash31d, formed by merging Microsoft's phi-2 with rhysjones/phi-2-orange-v2 using the SLERP (spherical linear interpolation) merge method. The model retains the phi-2 architecture and its 2048-token context length, and is intended for general language tasks. By merging, it aims to combine the strengths of its constituent models for improved text generation and understanding.
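The core idea of a SLERP merge is to interpolate each pair of corresponding weight tensors along the arc between them rather than along a straight line, which better preserves the geometry of the weights. A minimal sketch of the interpolation step (this is an illustration of the general technique, not the exact tooling or parameters used to build this model):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t is the blend factor: 0.0 returns v0, 1.0 returns v1.
    """
    n0 = math.sqrt(sum(x * x for x in v0)) + eps
    n1 = math.sqrt(sum(x * x for x in v1)) + eps
    # Angle between the two vectors, clamped for numerical safety.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    theta = math.acos(dot)
    if theta < eps:
        # Vectors nearly parallel: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]

# Toy example: blend two "weight" vectors halfway, as a merge would
# do per parameter tensor across the two source models.
merged = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In a real merge (e.g. with a tool such as mergekit), this interpolation is applied layer by layer across the two checkpoints, often with a different blend factor per layer.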