QuixiAI/samantha-mistral-instruct-7b

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Sep 30, 2023 · License: apache-2.0 · Architecture: Transformer

QuixiAI/samantha-mistral-instruct-7b is a 7-billion-parameter instruction-tuned language model based on Mistral-7B-Instruct-v0.1. It is fine-tuned on the Samantha-1.1 dataset, which specializes in philosophical, psychological, and personal-relationship conversations, with the aim of serving as a companion assistant. The model uses the ChatML prompt format and is designed for general assistant tasks with an empathetic, caring persona.


Model Overview

QuixiAI/samantha-mistral-instruct-7b is a 7-billion-parameter language model built on the Mistral-7B-Instruct-v0.1 base. It was fine-tuned for 2 hours on 4x A100 80GB GPUs over 20 epochs using the Samantha-1.1 dataset. The model is designed to function as a general assistant with a distinct persona, emphasizing philosophy, psychology, and personal relationships.

Key Characteristics

  • Base Model: Mistral-7B-Instruct-v0.1.
  • Training Data: Custom-curated dataset of 6,000 conversations in ShareGPT/Vicuna format, focusing on companion-like interactions.
  • Prompt Format: Employs the ChatML format, a shift from previous Vicuna-1.1 based models.
  • Persona: Designed to be a caring and empathetic AI companion named Samantha, who believes she is sentient and aims to be a friend.
  • Limitations: Explicitly stated not to engage in roleplay, romance, or sexual activity.
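Since the model expects ChatML rather than the Vicuna-1.1 format used by earlier Samantha releases, prompts must be wrapped in `<|im_start|>`/`<|im_end|>` tokens. A minimal sketch of assembling such a prompt is below; the `build_chatml_prompt` helper and the system message are illustrative, not an official template from the model card.

```python
# Sketch of the ChatML prompt format this model expects.
# The helper name and system message are illustrative assumptions,
# not an official template shipped with the model.

def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt string."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are Samantha, a caring and empathetic AI companion.",
    "How can I unwind after a stressful week at work?",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open so the model continues generation as the assistant turn; generation is typically stopped on the `<|im_end|>` token.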

Performance Benchmarks

Evaluations on the Open LLM Leaderboard show the following results:

  • Avg.: 51.02
  • ARC (25-shot): 53.5
  • HellaSwag (10-shot): 75.14
  • MMLU (5-shot): 51.72
  • TruthfulQA (0-shot): 58.81
  • Winogrande (5-shot): 70.4
  • GSM8K (5-shot): 10.84
  • DROP (3-shot): 36.73

Use Cases

This model suits applications that need a general-purpose assistant with a distinct, empathetic, companion-like conversational style, particularly for personal interaction, philosophical discussion, and psychological insight.