nasiruddin15/Mistral-grok-instract-2-7B-slerp
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4K · Concurrency cost: 1 · Published: Mar 28, 2024 · License: MIT · Architecture: Transformer · Open weights
nasiruddin15/Mistral-grok-instract-2-7B-slerp is a 7 billion parameter language model created by nasiruddin15, formed by merging Mistral-7B-Instruct-v0.2 and mistral-7b-grok using a SLERP (spherical linear interpolation) merge. The merge combines the instruction-following capabilities of Mistral-7B-Instruct-v0.2 with the Grok-style characteristics of mistral-7b-grok. It is designed for general-purpose text generation and instruction-based tasks, drawing on the behavior of both constituent models.
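A SLERP merge interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the geometry of both models' parameters better than plain averaging. The sketch below is illustrative only; it is not the merge script used to build this model, and the tensor shapes and interpolation fraction are assumptions for demonstration.

```python
# Illustrative SLERP between two weight tensors (not the actual merge script
# for this model; shapes and the interpolation fraction t are assumptions).
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Interpolate between tensors a and b at fraction t along the arc between them."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_norm = a_flat / (a_flat.norm() + eps)
    b_norm = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(torch.dot(a_norm, b_norm), -1.0, 1.0)
    omega = torch.acos(dot)            # angle between the two weight vectors
    if omega.abs() < eps:              # nearly parallel: fall back to linear interpolation
        return (1 - t) * a + t * b
    sin_omega = torch.sin(omega)
    coef_a = torch.sin((1 - t) * omega) / sin_omega
    coef_b = torch.sin(t * omega) / sin_omega
    return (coef_a * a_flat + coef_b * b_flat).reshape(a.shape).to(a.dtype)

# Example: blend two layers' weights halfway along the arc.
w_instruct = torch.randn(4096, 4096)   # stand-in for a Mistral-7B-Instruct-v0.2 weight
w_grok = torch.randn(4096, 4096)       # stand-in for a mistral-7b-grok weight
w_merged = slerp(0.5, w_instruct, w_grok)
```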
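A minimal usage sketch for text generation with the Transformers library, assuming the model id above is available on the Hugging Face Hub and that the prompt, sampling settings, and device placement shown here are only illustrative:

```python
# Minimal text-generation sketch; prompt and generation settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nasiruddin15/Mistral-grok-instract-2-7B-slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" assumes the accelerate package is installed.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Explain what a SLERP model merge is in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```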