TheDrummer/Cydonia-24B-v2

Rating: 4.5 (based on 2 reviews)
Status: Warm
Visibility: Public
Parameters: 24B
Precision: FP8
Context length: 32,768 tokens
License: other

Cydonia 24B v2 Overview

Cydonia 24B v2, developed by TheDrummer (BeaverAI), is a 24-billion-parameter language model fine-tuned from Mistral Small (2501). This iteration focuses on detailed, immersive roleplay and creative writing.

Key Capabilities

  • Extended Context Stability: Tested to maintain stability and coherence up to a 24,000 token context length, making it suitable for long-form interactions.
  • Rich Narrative Detail: Users report an improved ability to track intricate details, character anatomy, and complex scene descriptions across generated text.
  • Expansive Vocabulary: Demonstrates a broad and nuanced vocabulary, contributing to more expressive and varied outputs.
  • Roleplay Optimization: Specifically noted for its strong performance in roleplay scenarios, handling character consistency and narrative flow effectively.
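Because the tested stability window (24,000 tokens) is smaller than the model's full 32,768-token context, long sessions benefit from trimming older turns. Below is a minimal, model-agnostic sketch of that idea; the 4-characters-per-token estimate is a rough assumption for illustration, and a real setup would count tokens with the model's own tokenizer.

```python
# Sketch: keep the most recent chat turns within a token budget.
# CHARS_PER_TOKEN is a crude heuristic, not the model's tokenizer.
CHARS_PER_TOKEN = 4
TESTED_CONTEXT_TOKENS = 24_000  # stability window reported for Cydonia 24B v2


def estimate_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def trim_history(messages: list[str], budget: int = TESTED_CONTEXT_TOKENS) -> list[str]:
    """Drop the oldest messages until the estimated total fits the budget."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg)
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order


history = ["old turn " * 50_000, "recent turn A", "recent turn B"]
trimmed = trim_history(history, budget=1_000)
# The oversized oldest turn is dropped; the recent turns survive.
```

Trimming from the newest message backwards keeps the turns the model most needs for character consistency while staying inside the window where coherence was verified.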

Recommended Usage

Cydonia 24B v2 is particularly well-suited for applications requiring:

  • Long-form creative writing: Generating detailed stories, fanfiction, or complex narrative arcs.
  • Immersive roleplaying: Engaging in extended, character-driven conversational roleplay where context retention and descriptive richness are crucial.
  • Content generation: Producing text that requires a high degree of descriptive detail and vocabulary depth.
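For conversational roleplay, the prompt must follow the base model's instruct template. The sketch below assembles a multi-turn prompt in a generic Mistral-style `[INST]` format; this template, the persona text, and the helper name are illustrative assumptions, so verify the exact recommended template (e.g. Mistral V7) against the model card before use.

```python
def build_roleplay_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a multi-turn prompt in a Mistral-style [INST] format.

    `turns` is a list of (user_text, assistant_text) pairs; the final
    pair may carry an empty assistant_text for the pending reply.
    Assumed template sketch -- confirm the exact format against the
    model card before use.
    """
    parts: list[str] = []
    first = True
    for user_text, assistant_text in turns:
        # By convention, the system prompt is folded into the first user turn.
        content = f"{system}\n\n{user_text}" if first else user_text
        first = False
        parts.append(f"[INST] {content} [/INST]")
        if assistant_text:
            parts.append(f"{assistant_text}</s>")
    return "".join(parts)


prompt = build_roleplay_prompt(
    "You are Mira, a sardonic starship engineer.",  # hypothetical persona
    [("The reactor is overheating. What do you do?", "")],
)
```

Ending the string after `[/INST]` leaves the model positioned to generate the character's next reply, which is where context retention and descriptive richness matter most.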