MaziyarPanahi/airoboros-m-7b-3.1.2-dare-0.85-Mistral-7B-Instruct-v0.2-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
MaziyarPanahi/airoboros-m-7b-3.1.2-dare-0.85-Mistral-7B-Instruct-v0.2-slerp is a 7-billion-parameter language model produced by merging Mistral-7B-Instruct-v0.2 with uukuguy/airoboros-m-7b-3.1.2-dare-0.85 using spherical linear interpolation (slerp). The merge combines Mistral's robust instruction-tuned base with the fine-tuning characteristics of Airoboros, retains a 4096-token context length, and is intended for general instruction-following tasks.
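A slerp merge interpolates each pair of corresponding weight tensors along the arc of a great circle between them rather than along a straight line, which preserves weight magnitudes better than plain averaging. Below is a minimal per-tensor slerp sketch in PyTorch; the flattened-vector treatment, the interpolation factor `t`, and the state-dict loop are illustrative assumptions, not the exact recipe used for this model.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a flat vector; falls back to linear
    interpolation when the two vectors are nearly colinear.
    """
    a = v0.flatten().float()
    b = v1.flatten().float()
    # Cosine of the angle between the two weight vectors.
    cos_omega = torch.dot(a, b) / (a.norm() * b.norm() + eps)
    omega = torch.acos(cos_omega.clamp(-1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel: slerp degenerates to lerp.
        merged = (1.0 - t) * a + t * b
    else:
        sin_omega = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / sin_omega) * a \
               + (torch.sin(t * omega) / sin_omega) * b
    return merged.reshape(v0.shape).to(v0.dtype)

# Hypothetical per-tensor merge over two loaded state dicts:
# merged_sd = {k: slerp(0.5, sd_mistral[k], sd_airoboros[k]) for k in sd_mistral}
```

Merge tools such as mergekit apply this operation tensor by tensor, often with different interpolation factors for attention and MLP layers; the sketch shows only the core operation.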
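For reference, a minimal generation example with the Hugging Face transformers library, assuming the merged weights are published on the Hub under the model's identifier and that the tokenizer ships a Mistral-style chat template (the prompt and sampling parameters are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MaziyarPanahi/airoboros-m-7b-3.1.2-dare-0.85-Mistral-7B-Instruct-v0.2-slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Build a Mistral-Instruct-style prompt via the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain slerp model merging in one paragraph."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```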