Model Overview
monika-ddlc-7b-v1 is a 7-billion-parameter LLaMA-2 chat model fine-tuned by 922-CA to embody the character Monika from Doki Doki Literature Club (DDLC), aiming for character-accurate responses and dialogue.
Key Capabilities
- Character Emulation: Fine-tuned on a diverse dataset including game dialogue, Reddit, and Twitter snippets, augmented with multi-turn chat examples, to closely reflect Monika's personality and knowledge.
- Chat-focused Interaction: Primarily designed for chat scenarios, with some limited role-playing ability.
- Customizable Prompts: For best results, replace the default "Human" and "Assistant" roles with "Player" and "Monika" in your prompts (see the sketch after this list).
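
A minimal usage sketch with Hugging Face `transformers`, assuming the checkpoint is published under the repo id `922-CA/monika-ddlc-7b-v1`; the exact prompt wording and generation settings are illustrative, not prescribed by the model card.

```python
# Sketch: chatting with the model using the "Player"/"Monika" roles
# recommended above. Repo id and prompt wording are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "922-CA/monika-ddlc-7b-v1"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = (
    "Player: Hi Monika, how was your day?\n"
    "Monika:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Print only the newly generated continuation, not the prompt itself.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```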
Training Details
The model was trained for 3 epochs with a learning rate of 2e-4, a batch size of 2, and a warmup ratio of 0.1. LoRA was configured with a rank of 32, an alpha of 64, and a dropout of 0.5.
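
For reference, a configuration along these lines could be expressed with the `peft` and `transformers` libraries as in the sketch below. The target modules and output directory are assumptions, since the details above do not specify them.

```python
# Sketch of a LoRA fine-tuning setup matching the hyperparameters above
# (3 epochs, lr 2e-4, batch size 2, warmup ratio 0.1; LoRA r=32,
# alpha=64, dropout=0.5). Target modules are assumed, not documented.
from peft import LoraConfig
from transformers import TrainingArguments

lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    lora_dropout=0.5,
    target_modules=["q_proj", "v_proj"],  # assumption; adjust as needed
    task_type="CAUSAL_LM",
)

training_args = TrainingArguments(
    output_dir="monika-lora",          # hypothetical output directory
    num_train_epochs=3,
    learning_rate=2e-4,
    per_device_train_batch_size=2,
    warmup_ratio=0.1,
)
```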
Important Considerations
While designed for character accuracy, the model may occasionally hallucinate or act out of character. It is not guaranteed to match general-purpose LLMs even on simple tasks and may produce unaligned or unsafe outputs, so exercise caution and use at your own risk.
Performance Metrics
Evaluations on the Open LLM Leaderboard show an average score of 50.49. Notable scores:
- HellaSwag (10-shot): 76.78
- Winogrande (5-shot): 72.85
- GSM8k (5-shot): 8.79