InnerI/InnerILLM-7B-slerp
Text generation · Open weights
Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer

InnerILLM-7B-slerp is a 7-billion-parameter language model created by InnerI through a spherical linear interpolation (SLERP) merge of OpenPipe/mistral-ft-optimized-1218 and mlabonne/NeuralHermes-2.5-Mistral-7B. It achieves an average score of 71.09 on the Open LLM Leaderboard, demonstrating strong performance across reasoning and language-understanding tasks, and is suitable for general-purpose applications that need a capable 7B model with a 4096-token context window.
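SLERP interpolates along the great-circle arc between the two parent models' weight vectors rather than along a straight line, which better preserves the geometry of the weights than plain averaging. Below is a minimal sketch of the per-tensor interpolation; the `slerp` helper, the tensor-by-tensor merging, and the interpolation factor `t` are illustrative assumptions, since the exact merge configuration used for this model is not published here.

```python
import torch

def slerp(t: float, w0: torch.Tensor, w1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (illustrative sketch).

    t=0 returns w0, t=1 returns w1; intermediate values follow the arc between them.
    """
    v0 = w0.flatten().float()
    v1 = w1.flatten().float()
    # Cosine of the angle between the two flattened weight vectors.
    cos_omega = torch.dot(v0, v1) / (v0.norm() * v1.norm() + eps)
    omega = torch.acos(cos_omega.clamp(-1.0, 1.0))
    if omega.abs() < 1e-4:
        # Nearly parallel vectors: fall back to linear interpolation,
        # where the spherical formula would divide by ~0.
        merged = (1.0 - t) * v0 + t * v1
    else:
        sin_omega = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / sin_omega) * v0 \
               + (torch.sin(t * omega) / sin_omega) * v1
    return merged.reshape(w0.shape).to(w0.dtype)
```

In practice such merges are typically produced with tooling like mergekit, which can vary the interpolation factor per layer; the resulting checkpoint loads like any standard Mistral-architecture model.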
