osanseviero/mistral-instruct-slerp
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Jan 10, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open

osanseviero/mistral-instruct-slerp is a 7-billion-parameter instruction-tuned language model created by osanseviero. It merges two versions of Mistral-7B-Instruct (v0.1 and v0.2) using spherical linear interpolation (SLERP). The merge combines the strengths of its base components, offering refined instruction-following within a 4096-token context window. It is designed for general-purpose conversational AI and instruction-based tasks, and retains the Mistral architecture.
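The model card does not publish the exact merge recipe, but the SLERP method it names interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the geometry of the two checkpoints better than plain averaging. A minimal sketch of the per-tensor operation (the function name, the NumPy formulation, and the t=0.5 midpoint are illustrative assumptions, not the actual merge configuration):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two vectors' directions.
    """
    # Normalize to find the angle between the two weight vectors
    v0_u = v0 / (np.linalg.norm(v0) + eps)
    v1_u = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_u, v1_u), -1.0, 1.0)
    omega = np.arccos(dot)
    if abs(omega) < eps:
        # Vectors are nearly parallel: fall back to linear interpolation
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Hypothetical usage: merge each tensor of the two checkpoints at the midpoint
# merged[name] = slerp(0.5, weights_v01[name].ravel(), weights_v02[name].ravel())
```

In practice a merge like this is typically driven by a tool such as mergekit, which applies the interpolation layer by layer and can vary t per layer; the snippet above only shows the core math.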
