InnerI/InnerIAI-chat-7b-grok
Task: Text Generation | Model Size: 7B | Quantization: FP8 | Context Length: 4k | Concurrency Cost: 1 | Published: Mar 23, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights

InnerI/InnerIAI-chat-7b-grok is a 7-billion-parameter language model created by InnerI by merging InnerI/A-I-0xtom-7B-slerp and HuggingFaceH4/mistral-7b-grok. The merge uses the slerp (spherical linear interpolation) method, which blends the weights of the two constituent models rather than averaging them linearly. The model is intended for general chat applications, offering a balanced performance profile derived from its merged parents.
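The exact merge configuration is not published here, but the core slerp operation applied per weight tensor can be sketched as follows. This is a minimal illustration of spherical linear interpolation on flat vectors, not the actual merge recipe used for this model; the function name and the fallback threshold are illustrative choices.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t: interpolation factor in [0, 1] (0 returns v0, 1 returns v1).
    """
    # Angle between the two vectors, computed on normalized copies
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    # Nearly parallel vectors: fall back to plain linear interpolation
    if abs(dot) > 1.0 - eps:
        return (1 - t) * v0 + t * v1
    omega = np.arccos(dot)
    so = np.sin(omega)
    # Weights follow the great-circle arc instead of the straight chord
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Example: midpoint between two orthogonal unit vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # both components ≈ 0.7071, i.e. still unit length
```

In a real model merge this interpolation is applied tensor-by-tensor across the two checkpoints, often with a different `t` per layer group; tools such as mergekit automate that bookkeeping.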
