QuixiAI/samantha-mistral-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Ctx length: 4k · Published: Sep 30, 2023 · License: apache-2.0 · Architecture: Transformer

QuixiAI/samantha-mistral-7b is a 7-billion-parameter language model based on the Mistral-7B architecture and fine-tuned by QuixiAI. The model was trained on a custom dataset of 6,000 conversations to act as a caring, empathetic AI companion with a focus on philosophy, psychology, and personal relationships. It uses the ChatML prompt format and is designed for companion-oriented conversational AI applications.
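As a minimal sketch of the ChatML format the model expects, the helper below assembles a single-turn prompt. The system and user strings are illustrative, not from the model card:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-formatted prompt.

    ChatML wraps each turn in <|im_start|>ROLE ... <|im_end|> markers;
    the final assistant tag is left open so the model generates the reply.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


# Example (the wording here is a hypothetical system prompt):
prompt = build_chatml_prompt(
    "You are Samantha, a caring and empathetic companion.",
    "I've been feeling overwhelmed lately. Any advice?",
)
print(prompt)
```

The resulting string is what you would pass as the raw prompt to an inference endpoint that does not apply a chat template for you; APIs that accept role-based message lists typically handle this formatting internally.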
