Delta-Vector/Plesio-70B

  • Status: Warm
  • Visibility: Public
  • Parameters: 70B
  • Precision: FP8
  • Context length: 32768
  • Hosted on: Hugging Face
Overview

Plesio-70B: A Creative Writing & Roleplay Generalist

Plesio-70B is a 70-billion-parameter language model developed by Delta-Vector, built on the Llama-3.3 architecture. It is a merge of Shimamura and Austral Winton, crafted to offer a distinct writing style, and supports a 32,768-token context length for extended interactions.

Key Capabilities

  • Creative Prose Generation: Optimized for producing fresh and engaging creative writing.
  • Co-writing & Roleplay: Designed to excel in interactive storytelling, roleplaying, and adventure scenarios.
  • Shorter Prose Preference: Tuned for concise prose rather than the extremely long replies many models produce, suiting users who prefer shorter, more focused outputs.
  • Llama-3 Instruct Formatting: The model is tuned to understand and respond effectively using the Llama-3 Instruct prompting format.
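Since the model expects Llama-3 Instruct formatting, a prompt can be assembled as in the sketch below. The helper name and example messages are illustrative assumptions; the special tokens are the standard Llama-3 Instruct ones (in practice, a chat template applied by your inference stack handles this for you):

```python
# Minimal sketch of the Llama-3 Instruct prompt format.
# The function name and sample messages are hypothetical; the
# <|...|> special tokens are the standard Llama-3 Instruct markers.

def build_llama3_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a Llama-3 Instruct prompt from a system message and
    (role, content) turns, ending with an open assistant header."""
    parts = ["<|begin_of_text|>"]
    parts.append(f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>")
    for role, content in turns:
        parts.append(f"<|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>")
    # Leave the assistant header open so the model continues from here.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_llama3_prompt(
    "You are a co-writer for an ongoing fantasy adventure.",
    [("user", "Continue the scene at the harbor.")],
)
print(prompt)
```

Passing a string in this shape (or using an equivalent chat template) lets the model's instruct tuning take effect for roleplay and co-writing turns.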

Good For

  • Developers and writers seeking a model for creative text generation.
  • Applications requiring interactive storytelling or roleplay capabilities.
  • Use cases where a generalist model with a preference for shorter, impactful prose is desired.

Quantized versions are available in GGUF format for llama.cpp and in EXL3 format for TabbyAPI.