arcee-ai/Hermes-Mistral-Saul-Slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 7, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Hermes-Mistral-Saul-Slerp is a 7-billion-parameter language model from arcee-ai, built on the Mistral architecture. It is a merge of Equall/Saul-Instruct-v1 and NousResearch/Nous-Hermes-2-Mistral-7B-DPO using the SLERP (spherical linear interpolation) method, and is designed to combine the strengths of its constituent models, offering a versatile foundation for a range of natural language processing tasks.
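SLERP merging interpolates each pair of corresponding weight tensors along the great circle between them rather than along a straight line, which better preserves the geometry of the weights. The sketch below shows the core interpolation for a single flattened tensor; it is a minimal illustration only (the function name and parameters are ours), and real merges like this one are typically produced with a dedicated tool such as mergekit rather than hand-rolled code.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc between them.
    """
    # Angle between the two tensors, computed on normalized copies
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)
    so = np.sin(omega)
    if abs(so) < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

In a full merge this interpolation is applied layer by layer across both checkpoints, often with a different `t` per layer group (e.g. attention vs. MLP weights).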
