PocketDoc/Dans-PersonalityEngine-V1.3.0-24b

Released: May 8, 2025 · License: apache-2.0
Parameters: 24B · Precision: FP8 · Context length: 32768 tokens
Available on Hugging Face

Dans-PersonalityEngine-V1.3.0-24b Overview

Dans-PersonalityEngine-V1.3.0-24b is a 24 billion parameter language model developed by PocketDoc, fine-tuned from mistralai/Mistral-Small-3.1-24B-Base-2503. The model is designed for versatility, performing well in both creative and technical applications.

Key Capabilities & Features

  • Versatile Performance: Optimized for creative tasks such as roleplay and co-writing, alongside technical challenges like code generation, tool use, and complex reasoning.
  • Multilingual Support: V1.3.0 introduces support for 10 languages: Arabic, Chinese, English, French, German, Hindi, Japanese, Korean, Portuguese, and Spanish. English remains its primary language and delivers the strongest performance.
  • Extended Context Length: Features a substantial context length of 32768 tokens, and can operate with degraded recall at contexts up to 131072 tokens.
  • Custom Prompting Format: Utilizes a unique "DanChat-2" format, distinct from ChatML, employing special tokens for each role to minimize biases and improve task adaptability.
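The model card does not reproduce DanChat-2's actual special tokens here, so the sketch below only illustrates the general shape of a role-token prompt format. The token strings (`<|system|>`, `<|endofturn|>`, etc.) are placeholders assumed for illustration, not the model's real vocabulary; in practice the tokenizer's built-in chat template (`tokenizer.apply_chat_template`) should be used.

```python
# Illustrative sketch of a role-token prompt format in the spirit of
# DanChat-2. The token strings are placeholders, NOT the model's actual
# special tokens; use the tokenizer's chat template for real inference.

ROLE_TOKENS = {
    "system": "<|system|>",        # placeholder token, assumed
    "user": "<|user|>",            # placeholder token, assumed
    "assistant": "<|assistant|>",  # placeholder token, assumed
}
EOT = "<|endofturn|>"  # placeholder end-of-turn marker, assumed

def format_prompt(messages):
    """Concatenate turns, prefixing each with its role token and
    closing each with an end-of-turn marker."""
    parts = [ROLE_TOKENS[m["role"]] + m["content"] + EOT for m in messages]
    # Open an assistant turn so the model generates the reply from here.
    return "".join(parts) + ROLE_TOKENS["assistant"]

prompt = format_prompt([
    {"role": "system", "content": "You are a helpful co-writer."},
    {"role": "user", "content": "Continue the story."},
])
```

Tagging every role with its own distinct token, rather than reusing one generic delimiter, is the stated rationale for DanChat-2: it gives the model an unambiguous signal of who is speaking, which helps reduce role-bleed biases.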

Training & Development

The model was trained using Axolotl on 8x H100 GPUs over 50 hours, with resources provided by Prime Intellect and Kalomaze.

Recommended Use Cases

  • Creative Content Generation: Ideal for applications requiring nuanced roleplay, story co-writing, and other imaginative text generation.
  • Technical Problem Solving: Suitable for developers needing assistance with code generation, implementing tool use, and tackling complex logical reasoning tasks.
  • Multilingual Applications: Can be employed in scenarios requiring interaction or content generation across its 10 supported languages, with best results in English.
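For any of the use cases above, one common deployment path is an OpenAI-compatible serving endpoint (e.g. a local vLLM server). The sketch below only builds a chat-completions request payload; the endpoint URL, the server setup, and the sampling settings are assumptions for illustration.

```python
# Sketch of a chat-completions payload for an OpenAI-compatible endpoint
# (e.g. a local vLLM server). Server URL and sampling values are
# illustrative assumptions, not recommendations from the model card.
import json

def build_request(user_message, system_prompt="You are a helpful assistant."):
    """Build a chat-completions request body targeting this model."""
    return {
        "model": "PocketDoc/Dans-PersonalityEngine-V1.3.0-24b",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 512,    # illustrative generation budget
        "temperature": 0.7,   # illustrative sampling temperature
    }

payload = build_request("Write a short haiku about autumn.")
body = json.dumps(payload)
# POST `body` to e.g. http://localhost:8000/v1/chat/completions
# (the network call itself is omitted here).
```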