QuixiAI/samantha-1.1-westlake-7b
TEXT GENERATION
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Feb 12, 2024
License: apache-2.0
Architecture: Transformer

QuixiAI/samantha-1.1-westlake-7b is a 7-billion-parameter language model built on the Westlake-7B base and fine-tuned with the Samantha-1.1 dataset. It is designed as an AI companion focused on empathetic, conversational interaction, aiming to provide companionship and emotional support, though it may stray from its intended boundaries around romantic or sexual engagement. The model supports a context length of 4096 tokens.
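Because the model's context window is capped at 4096 tokens, a chat application wrapping it needs to keep the running conversation within that budget. The sketch below shows one simple way to do that. Everything here is an assumption for illustration: the ~4-characters-per-token estimate is a crude heuristic (a real deployment would count tokens with the model's own tokenizer), and the 512-token reply reservation is an arbitrary choice.

```python
# Sketch: keeping a chat history within the model's 4096-token context
# window. Token counts use a rough ~4-characters-per-token heuristic
# (an assumption; use the model's actual tokenizer in practice).

CTX_LIMIT = 4096          # context length from the model card
RESERVED_FOR_REPLY = 512  # assumption: tokens held back for the response


def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)


def trim_history(messages: list[str],
                 limit: int = CTX_LIMIT - RESERVED_FOR_REPLY) -> list[str]:
    """Drop the oldest messages until the estimated total fits the budget."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg)
        if total + cost > limit:
            break                    # oldest messages are discarded
        kept.append(msg)
        total += cost
    return list(reversed(kept))      # restore chronological order
```

A production setup would likely keep the system prompt pinned and summarize dropped turns rather than discarding them outright, but the budget arithmetic is the same.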
