InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 13, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp is a 7-billion-parameter language model created by InnerI. It was formed by merging OpenPipe/mistral-ft-optimized-1218 and NousResearch/Yarn-Mistral-7b-128k using the slerp (spherical linear interpolation) merge method. The merge combines the strengths of an instruction-tuned Mistral variant with a long-context Mistral variant, aiming to retain the capabilities of both parents in a single model. Its architecture is based on the Mistral family, offering a balance of efficiency and capability for general-purpose applications.
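To illustrate the idea behind a slerp merge, the sketch below shows spherical linear interpolation applied to two weight tensors. This is a generic NumPy illustration of the slerp formula, not the exact implementation used to produce this model (which was built with a dedicated merge toolchain); the function name and the fallback behavior for near-colinear tensors are assumptions for the example.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors at fraction t in [0, 1]."""
    v0_f = v0.ravel().astype(np.float64)
    v1_f = v1.ravel().astype(np.float64)
    # Angle between the two flattened weight vectors
    n0 = v0_f / np.linalg.norm(v0_f)
    n1 = v1_f / np.linalg.norm(v1_f)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    omega = np.arccos(dot)
    # Nearly colinear tensors: fall back to plain linear interpolation
    if abs(np.sin(omega)) < eps:
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * v0_f + s1 * v1_f).reshape(v0.shape)
```

In a real merge this interpolation is applied layer by layer across the two parent checkpoints, often with a different mixing fraction `t` per layer; at `t=0` the result equals the first parent's weights and at `t=1` the second's, while intermediate values follow the arc between them rather than the straight line a plain weighted average would take.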
