PocketDoc/Dans-PersonalityEngine-V1.2.0-24b

  • Visibility: Public
  • Parameters: 24B
  • Precision: FP8
  • Context length: 32768 tokens
  • Date: Feb 17, 2025
  • License: apache-2.0
  • Hosted on: Hugging Face

Dans-PersonalityEngine-V1.2.0-24b Overview

Dans-PersonalityEngine-V1.2.0-24b is a 24 billion parameter language model built upon the mistralai/Mistral-Small-24B-Base-2501 base model. It features a substantial context length of 32768 tokens, enabling it to handle extensive conversational and textual inputs. The model is licensed under Apache-2.0 and primarily supports the English language.

Key Capabilities

  • Versatile Performance: A general-purpose model designed to perform well across a wide range of tasks.
  • Co-writing and Roleplay: Highly capable in interactive content generation, including collaborative writing and role-playing scenarios.
  • Sentiment Analysis and Summarization: Effective for analytical tasks, suitable for integration into data processing pipelines.
  • Diverse Training: Trained on a broad spectrum of data, including one-shot and multi-turn instructions, tool use, and text adventure games, enhancing its adaptability.
  • ChatML Format: Utilizes the standard ChatML prompting format for structured interactions.
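Since the model expects the ChatML prompting format, prompts should wrap each turn in `<|im_start|>`/`<|im_end|>` markers. As a minimal sketch (the helper name and message structure are illustrative, not part of the model card), a list of role/content messages can be rendered like this:

```python
def to_chatml(messages):
    """Render a list of {"role": ..., "content": ...} dicts into a
    ChatML prompt string. The trailing assistant header cues the
    model to generate the next reply."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)
```

Most inference backends apply an equivalent chat template automatically; building the string by hand is mainly useful for raw-completion endpoints.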

Good For

  • Interactive Applications: Ideal for chatbots, virtual assistants, and applications requiring dynamic, personality-driven responses.
  • Content Generation: Suitable for creative writing, script generation, and narrative development.
  • Text Processing: Can be employed for tasks like extracting sentiment from text or condensing long documents.
  • Developers using SillyTavern: Provides specific context and instruct templates for seamless integration with SillyTavern.

Note on Usage: Users experiencing incoherent outputs (e.g., verbatim repetition) are advised to add <s> to the very beginning of their context, as some backends do not automatically prepend the BOS token.
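The workaround above can be sketched as a small guard before sending the prompt to the backend (a hypothetical helper, assuming the Mistral-style <s> BOS token):

```python
def ensure_bos(prompt: str, bos_token: str = "<s>") -> str:
    """Prepend the BOS token only if the backend has not already
    included it, avoiding a doubled BOS at the start of context."""
    if prompt.startswith(bos_token):
        return prompt
    return bos_token + prompt
```

Backends that already insert the BOS token are left untouched, so the guard is safe to apply unconditionally.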