arcee-ai/Mistral-Hermes-Support-Ties

TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Mistral-Hermes-Support-Ties is a 7 billion parameter language model developed by arcee-ai, created by merging Mistral-7B-v0.1+predibase/customer_support and Nous-Hermes-2-Mistral-7B-DPO using the TIES merging method. This model is specifically designed to enhance customer support interactions and general instruction following capabilities. It leverages the strengths of its base models to provide a versatile solution for conversational AI.


Overview

Mistral-Hermes-Support-Ties is a 7 billion parameter language model developed by arcee-ai, built upon the mistralai/Mistral-7B-Instruct-v0.2 base model. It was created with the TIES (TrIm, Elect Sign & Merge) merging method, which trims low-magnitude task-vector parameters and resolves sign conflicts before merging two specialized models:

  • mistralai/Mistral-7B-v0.1+predibase/customer_support: Mistral-7B-v0.1 with the predibase/customer_support LoRA adapter applied (the + is mergekit's adapter notation); this component contributes expertise in customer support dialogues.
  • NousResearch/Nous-Hermes-2-Mistral-7B-DPO: This component enhances the model's general instruction following and conversational abilities through DPO (Direct Preference Optimization) fine-tuning.
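A TIES merge of this shape can be written as a mergekit configuration. The sketch below is illustrative only, not the published recipe: the density/weight values and dtype are assumptions, and the actual merge settings may differ.

```yaml
# Illustrative mergekit TIES config for a merge like this one.
# density/weight values and dtype are assumptions, not the released recipe.
models:
  - model: mistralai/Mistral-7B-v0.1+predibase/customer_support  # base + LoRA adapter
    parameters:
      density: 0.5   # fraction of task-vector parameters kept after trimming
      weight: 0.5
  - model: NousResearch/Nous-Hermes-2-Mistral-7B-DPO
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mistralai/Mistral-7B-Instruct-v0.2
dtype: bfloat16
```

With mergekit installed, such a config would be run with `mergekit-yaml config.yml ./output-dir`.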

Key Capabilities

  • Enhanced Customer Support: Optimized for handling customer inquiries and providing helpful responses.
  • Improved Instruction Following: Benefits from the DPO fine-tuning of Nous-Hermes-2, leading to more accurate and aligned responses to user prompts.
  • Versatile Conversational AI: Combines specialized support knowledge with strong general-purpose conversational skills.

Good For

  • Customer Service Automation: Ideal for chatbots and virtual assistants in customer support roles.
  • General Conversational Agents: Suitable for applications requiring robust instruction following and engaging dialogue.
  • Fine-tuning for Specific Domains: Provides a strong foundation for further specialization in conversational AI tasks.
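For the conversational use cases above, prompts typically need the model's chat template. A minimal sketch, assuming the merge inherits the ChatML format used by Nous-Hermes-2-Mistral-7B-DPO (an assumption; verify against the released tokenizer config):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Format a single-turn conversation in ChatML.

    Nous-Hermes-2 models use ChatML; whether this merge inherits that
    template is an assumption -- check the model's tokenizer_config.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful customer support agent.",
    "How do I reset my password?",
)
print(prompt)
```

The resulting string can be passed to any completion endpoint serving the model; generation should stop on the `<|im_end|>` token.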