Model Overview
Dans-Archive/Dans-MysteryModel-13b is a 13 billion parameter language model developed by Dans-Archive and serves as a prototype for the PersonalityEngine Mk. 2. It is a versatile chat / chat-instruct hybrid built upon the Holodeck-1 architecture and fine-tuned with QLoRA at a 4096-token sequence length.
Key Capabilities
- Multipurpose Chat: Handles both one-shot and multi-round conversational instructions effectively.
- Role-Playing: Designed to engage in diverse role-playing scenarios.
- Text Adventures: Excels in generating responses for text adventure games, leveraging the curated Skein dataset.
- Pygmalion / Metharme Prompt Format: Supports a flexible prompt structure for various interaction styles.
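The Metharme prompt format mentioned above interleaves special role tokens rather than plain-text speaker labels. The sketch below assembles such a prompt; the token names (`<|system|>`, `<|user|>`, `<|model|>`) follow the published Pygmalion/Metharme convention, and the helper name and example strings are illustrative assumptions, not from this model card.

```python
# Sketch of a Metharme-style prompt builder. The role tokens follow the
# Pygmalion/Metharme convention; verify against the model card before use.

def build_metharme_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a multi-round prompt. `turns` holds (user, model) pairs;
    leave the final model reply empty so the model completes it."""
    parts = [f"<|system|>{system}"]
    for user_msg, model_msg in turns:
        parts.append(f"<|user|>{user_msg}")
        parts.append(f"<|model|>{model_msg}")
    return "".join(parts)

prompt = build_metharme_prompt(
    "Enter roleplay mode. You are a terse ship computer.",
    [("Status report.", "")],  # empty reply: the model generates from here
)
print(prompt)
```

Because the final `<|model|>` segment is left empty, the assembled string ends at the point where generation should begin.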
Training Details
The model underwent 16 hours of QLoRA training on a single RTX 3090, with a PEFT rank/alpha (R/A) of 32/32. Its development draws on the Holodeck-1 base model by KoboldAI and the Skein text adventure dataset, also curated by the KoboldAI community.
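The stated hyperparameters can be summarized as a configuration fragment. This is a minimal sketch: the field names mirror the common `peft.LoraConfig` arguments, the rank, alpha, and sequence length come from this card, and every other value is an illustrative assumption.

```python
# QLoRA adapter configuration implied by the card's "PEFT R/A of 32/32".
# Only r, lora_alpha, and the sequence length are documented; the rest
# are common defaults, assumed for illustration.
qlora_config = {
    "r": 32,                 # LoRA rank (stated in the card)
    "lora_alpha": 32,        # LoRA alpha (stated in the card)
    "lora_dropout": 0.05,    # assumption: a common default
    "bias": "none",          # assumption
    "task_type": "CAUSAL_LM",
}
max_seq_len = 4096           # context length stated in the overview
```

Keeping alpha equal to the rank (32/32) gives an effective LoRA scaling factor of 1, a common choice when the rank is moderate.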
Good For
- Developers experimenting with advanced chat and instruct models.
- Applications requiring dynamic role-playing or interactive storytelling.
- Creating engaging text-based adventure games.
- Prototyping conversational AI with a focus on personality and context retention.