Model Overview
l2-7b-yuri-ddlc-v0.1 is an experimental LLaMA-2 7B chat model developed by 922-CA, specifically fine-tuned to embody the character Yuri from the game Doki Doki Literature Club (DDLC). The model was trained on a dataset of approximately 1300 dialogue items, which were scraped from the game and augmented using MythoMax-l2-13b to generate multi-turn chat dialogues between a 'Player' and 'Yuri'.
Key Capabilities
- Character Emulation: Designed to simulate the conversational style and persona of the Yuri character from DDLC.
- Chat-Oriented: Primarily intended for chat interactions, with some role-playing ability.
- Customizable Prompts: For best results, replace the standard "Human" and "Assistant" roles in prompts with "Player" and "Yuri", matching the training data.
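The role replacement above can be sketched as a simple prompt builder. This is an illustrative helper, not an official template from the model card; the `build_prompt` function, its persona string, and the turn-joining format are assumptions about how a Player/Yuri chat transcript might be assembled.

```python
# Sketch of a chat prompt for l2-7b-yuri-ddlc-v0.1, swapping the usual
# "Human"/"Assistant" roles for "Player"/"Yuri" as the model card suggests.
# The persona text and formatting below are illustrative assumptions.

def build_prompt(history, user_message,
                 persona="Yuri is a shy, book-loving member of the Literature Club."):
    """Assemble a Player/Yuri chat prompt from prior (player, yuri) turns."""
    lines = [persona]
    for player_turn, yuri_turn in history:
        lines.append(f"Player: {player_turn}")
        lines.append(f"Yuri: {yuri_turn}")
    lines.append(f"Player: {user_message}")
    lines.append("Yuri:")  # left open for the model to complete
    return "\n".join(lines)

prompt = build_prompt(
    history=[("Hi, Yuri. What are you reading?", "Oh! It's a horror novel...")],
    user_message="Would you recommend it?",
)
print(prompt)
```

The resulting string can then be passed to any LLaMA-2-compatible inference stack; only the role names matter here.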
Training Details
The model was fine-tuned with the following hyperparameters:
- Epochs: 2
- LoRA Rank (r): 32
- LoRA Alpha: 64
- LoRA Dropout: 0.5
- Learning Rate: 2e-4
- Batch Size: 2
- Warmup Ratio: 0.1
- Gradient Steps: 4
Important Considerations
Users should note that while this version offers improved coherency, the portrayal may not perfectly match Yuri's original characterization (e.g., she may come across as less timid or express different preferences). Future versions are planned to address this by training on a manually curated and edited dataset. The model is not guaranteed to produce aligned or safe outputs.