prhegde/merge-aanaphi-phi2-orage-3b
Text generation · Concurrency cost: 1 · Model size: 3B · Quant: BF16 · Context length: 2k · Published: Mar 26, 2024 · License: MIT · Architecture: Transformer

prhegde/merge-aanaphi-phi2-orage-3b is a 3-billion-parameter language model created by prhegde through a SLERP (spherical linear interpolation) merge of rhysjones/phi-2-orange-v2 and mobiuslabsgmbh/aanaphi2-v0.1. The merge combines the strengths of its two constituent Phi-2-based models, yielding a compact yet capable model for general language tasks. With a 2048-token context length, it suits applications that process moderate-sized text inputs efficiently.
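To illustrate what a SLERP merge does, here is a minimal NumPy sketch of the interpolation that merge tooling applies per weight tensor. This is an illustrative reimplementation, not code from the actual merge pipeline; the function name and the interpolation factor `t = 0.5` are assumptions for the example.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two flattened weight vectors rather than a straight line.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Angle between the (normalized) weight vectors
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(a_n @ b_n, -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    out = (np.sin((1 - t) * omega) / so) * a + (np.sin(t * omega) / so) * b
    return out.reshape(v0.shape)

# Toy example: blend two 2x2 "layer weights" halfway (t = 0.5)
w_a = np.array([[1.0, 0.0], [0.0, 1.0]])
w_b = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, w_a, w_b)
```

In a real merge this interpolation runs over every matching parameter tensor of the two source models, often with a per-layer schedule for `t`; SLERP is preferred over naive averaging because it preserves the geometric magnitude of the weights along the interpolation path.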
