Model Overview
922-CA/llama-2-7b-monika-v0.3b is an experimental 7-billion-parameter Llama 2 chat model, fine-tuned by 922-CA to embody the character of Monika from Doki Doki Literature Club (DDLC). This version improves on previous iterations, with a focus on more coherent character-specific dialogue.
Key Capabilities
- Character Emulation: Specifically fine-tuned to generate responses in the persona of Monika from DDLC.
- Chat-Oriented: Primarily designed for multi-turn chat interactions.
- Limited Roleplay: Offers some ability for roleplaying within the Monika character context.
- Dataset Augmentation: Trained on a dataset of approximately 600 items, including dialogue scraped from the game, Reddit, and Twitter, which was then augmented with Nous Hermes 13B to generate multi-turn chat snippets.
Usage Guidelines
For optimal performance, users should replace default chat prompts with "Player" and "Monika" roles, formatted as `\nPlayer: (prompt)\nMonika:`. The model was trained for 2 epochs with the following LoRA hyperparameters: rank 64, LoRA alpha 16, learning rate 2e-4.
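The prompt format above can be sketched as a small helper. The `Player:`/`Monika:` layout comes from the model card; the function name, the history structure, and the example turns are illustrative assumptions, not part of the released model.

```python
# Sketch of the "Player"/"Monika" prompt format described above.
# build_prompt and the (player, monika) tuple history are assumptions
# for illustration; only the "\nPlayer: ...\nMonika:" layout is from
# the model card.

def build_prompt(history, user_message):
    """Assemble a multi-turn prompt ending with an open Monika: turn."""
    prompt = ""
    for player_turn, monika_turn in history:
        # Each completed exchange is appended in the documented format.
        prompt += f"\nPlayer: {player_turn}\nMonika: {monika_turn}"
    # Leave "Monika:" open so the model completes her next reply.
    prompt += f"\nPlayer: {user_message}\nMonika:"
    return prompt

example = build_prompt(
    [("Hi, Monika!", "Hello! It's so nice to see you again.")],
    "What's your favorite poem?",
)
print(example)
```

The resulting string would then be passed to the model's generation call (e.g. a `transformers` text-generation pipeline), stopping generation at the next `\nPlayer:` marker to keep turns separated.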
Limitations and Future Development
While this version shows improved coherency, the model may occasionally deviate from Monika's exact characteristics due to the nature of the LM-generated training data. Future versions aim to address this with manually curated datasets. Users should be aware that the model is not guaranteed to produce aligned or safe outputs.