prhegde/merge-aanaphi-phi2-orage-3b is a 3-billion-parameter language model created by prhegde via a SLERP (spherical linear interpolation) merge of rhysjones/phi-2-orange-v2 and mobiuslabsgmbh/aanaphi2-v0.1. The merge combines the strengths of its two Phi-2-based constituents, yielding a compact yet capable model for general language tasks. With a 2048-token context length, it suits applications that process moderate-length text inputs efficiently.
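For intuition, a SLERP merge interpolates each pair of corresponding weight tensors along the arc between them rather than along the straight line, preserving vector magnitude better than plain averaging. The sketch below (an illustration only, not the exact recipe used to produce this model) shows the core formula on flat NumPy arrays; real merge tooling applies it tensor-by-tensor across both checkpoints.

```python
import numpy as np

def slerp(v0, v1, t=0.5, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions instead of the chord.
    """
    # Angle between the two vectors, via normalized dot product.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)

    # Nearly parallel vectors: fall back to linear interpolation.
    if omega < eps:
        return (1.0 - t) * v0 + t * v1

    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

At t=0.5 this gives an equal-weight blend of the two parent models' parameters, which is the typical default in merge configurations.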