TehVenom/Pygmalion-7b-Merged-Safetensors

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Apr 30, 2023 · Architecture: Transformer

Pygmalion 7B is a 7 billion parameter conversational language model developed by PygmalionAI, based on Meta's LLaMA-7B architecture. This fine-tuned model is specifically optimized for fictional dialogue and character roleplay, utilizing a persona-driven chat format. It excels at generating engaging and contextually relevant responses for entertainment purposes, making it suitable for interactive storytelling and character-based applications.


Overview

Pygmalion 7B is a 7 billion parameter dialogue model, fine-tuned from Meta's LLaMA-7B architecture. Developed by PygmalionAI, this model is specifically designed for conversational interactions, building upon data from the earlier Pygmalion-6B-v8-pt4 project. In this merged release, the XOR weight deltas have already been applied, so the model is ready for immediate use.

Key Capabilities

  • Fictional Conversation: Optimized for generating dialogue for entertainment and character roleplay scenarios.
  • Persona-Driven Interaction: Utilizes a specific prompting format that allows users to define a character's persona, enabling the model to portray that character consistently.
  • Contextual Dialogue: Supports a sliding window of chat history to maintain conversational context throughout interactions.
  • Automatic Response Completion: Automatically emits an end-of-text token (</s>) when it determines a response is complete.
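The persona format and sliding-window history described above can be sketched as follows. This is a minimal illustration, not an official utility: the character name, persona text, and dialogue are invented for the example, and the whitespace-based token count is a rough stand-in for the model's actual tokenizer.

```python
def build_prompt(char_name, persona, history, user_message):
    """Assemble a prompt in the Pygmalion persona-driven chat format:
    a persona block, a <START> separator, prior dialogue, and the
    user's latest message, ending with the character's name as a cue."""
    lines = [f"{char_name}'s Persona: {persona}", "<START>"]
    lines.extend(history)  # alternating "You: ..." / "{char_name}: ..." turns
    lines.append(f"You: {user_message}")
    lines.append(f"{char_name}:")
    return "\n".join(lines)

def truncate_history(history, max_tokens, count_tokens=lambda s: len(s.split())):
    """Sliding window: drop the oldest turns until the remaining
    history fits within the token budget. The default token counter
    is a crude whitespace split; swap in the real tokenizer's count."""
    kept = list(history)
    while kept and sum(count_tokens(turn) for turn in kept) > max_tokens:
        kept.pop(0)
    return kept

# Illustrative usage with a hypothetical character named "Aria".
history = [
    "You: Hello there!",
    "Aria: Greetings, traveler. What brings you to my tower?",
]
prompt = build_prompt(
    "Aria",
    "A reclusive wizard who speaks formally and values precision.",
    truncate_history(history, max_tokens=512),
    "I seek your counsel.",
)
print(prompt)
```

The completed prompt can then be passed to the model for generation; because the model emits an end-of-text token when the response is done, generation can simply stop at that token.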

Intended Use and Limitations

This model's primary intended use is for fictional conversation for entertainment purposes. It was not fine-tuned for safety or harmlessness and may produce socially unacceptable, offensive, or factually incorrect text due to its training data. Users should be aware of these limitations, especially regarding potential biases and inaccuracies, and use it strictly within its intended scope of fictional entertainment.