specialv/Vims-7b
Text Generation · Open Weights · Cold

- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Mar 28, 2026
- License: apache-2.0
- Architecture: Transformer

specialv/Vims-7b is a 7-billion-parameter language model from specialv, created by merging Open-Orca/Mistral-7B-OpenOrca and mistralai/Mistral-7B-v0.1 with spherical linear interpolation (SLERP). The merge aims to combine the instruction-following ability of OpenOrca with the foundational performance of the Mistral-7B base model, making it suitable for general-purpose language tasks.
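To illustrate the merge method: SLERP interpolates along the arc between two weight vectors rather than along the straight line, preserving the geometry of the parameter space better than plain averaging. The sketch below shows the core formula on flattened weight vectors; it is an illustration of the general SLERP idea, not the exact implementation (e.g. mergekit) used to build Vims-7b, and the function name and interpolation factor `t` are assumptions for this example.

```python
import math

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    a, b: lists of floats (e.g. flattened per-layer model weights).
    t: interpolation factor in [0, 1]; t=0 returns a, t=1 returns b.
    Illustrative sketch only -- real merge tools apply this per tensor.
    """
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    # Angle between the two weight vectors (cosine clipped for safety)
    dot = sum(x * y for x, y in zip(a, b)) / (norm_a * norm_b + eps)
    theta = math.acos(max(-1.0, min(1.0, dot)))
    if theta < eps:
        # Nearly parallel vectors: fall back to ordinary linear interpolation
        return [(1 - t) * x + t * y for x, y in zip(a, b)]
    s = math.sin(theta)
    wa = math.sin((1 - t) * theta) / s
    wb = math.sin(t * theta) / s
    return [wa * x + wb * y for x, y in zip(a, b)]
```

At t=0.5 with two orthogonal unit vectors, the result lands on the unit sphere midway between them, which is the property that motivates SLERP over simple weight averaging.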
