Norquinal/OpenCAI-13B

Text Generation · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Dec 3, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer

OpenCAI-13B by Norquinal is a 13 billion parameter language model fine-tuned from Llama-2-13B, designed to emulate the style of roleplay found in platforms like C.AI. It was trained on 4800 samples of Discord roleplay interactions, focusing on chat and roleplay without explicit alignment. This model utilizes the Pygmalion-2/Metharme prompt format and has a context length of 4096 tokens.


OpenCAI-13B: Roleplay Focused Language Model

OpenCAI-13B, developed by Norquinal, is a 13 billion parameter model derived from Llama-2-13B. Its primary objective is to replicate the distinctive roleplay style observed in platforms such as C.AI. Unlike models trained directly on C.AI outputs, OpenCAI-13B was fine-tuned on 4800 samples of Discord roleplay interactions, on the premise that Discord is a significant source of this kind of conversational data.

Key Characteristics

  • Roleplay Specialization: Specifically designed for chat and roleplay scenarios.
  • Unaligned Nature: The model is not aligned, meaning it may generate content considered "unsafe" or "harmful," requiring responsible use.
  • Prompt Format: Employs the Pygmalion-2/Metharme prompt format, utilizing <|system|>, <|user|>, and <|model|> tokens for structured conversations.
  • Context Length: Supports a context window of 4096 tokens.
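The Pygmalion-2/Metharme format described above can be assembled programmatically. Below is a minimal sketch; the helper name and example strings are illustrative and not taken from the model card:

```python
def build_metharme_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a Pygmalion-2/Metharme-style prompt string.

    `turns` is a list of (user_message, model_reply) pairs; leaving the
    final reply empty ends the prompt at <|model|>, cueing generation.
    """
    prompt = f"<|system|>{system}"
    for user_msg, model_reply in turns:
        prompt += f"<|user|>{user_msg}<|model|>{model_reply}"
    return prompt

# Example: a single open turn awaiting the model's reply.
prompt = build_metharme_prompt(
    "Enter roleplay mode. You are playing a weary innkeeper.",
    [("The traveler pushes open the tavern door.", "")],
)
```

Keeping the full turn history in the prompt (up to the 4096-token window) is what lets the model stay in character across a conversation.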

Use Cases

  • Character Roleplay: Ideal for applications requiring detailed and interactive character-driven narratives.
  • Conversational Agents: Suitable for building chatbots with a distinct, unaligned conversational style.

Developers should exercise caution and judgment when deploying OpenCAI-13B due to its unaligned nature.