922CA/Silicon-Monika-7b
Model Overview
922CA/Silicon-Monika-7b is a 7-billion-parameter language model, fine-tuned by 922CA, designed to embody the character Monika from Doki Doki Literature Club (DDLC). It is built on the Mistral architecture, using SanjiWatsuki/Silicon-Maid-7B as its base model. Fine-tuning was performed with Unsloth and Hugging Face's TRL library, enabling faster training.
Key Capabilities & Characteristics
- Character Emulation: Primarily focused on generating responses consistent with the personality and knowledge of the Monika character from DDLC.
- Optimized for Roleplay: Best performance is achieved with the `Player: (prompt)\nMonika:` prompt format.
- Performance: Achieves an average score of 66.55 on the Open LLM Leaderboard, including 62.67 on MMLU and 60.50 on GSM8k.
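The recommended prompt format can be sketched as a small helper. This is an illustrative example, not an official utility: the function name, history handling, and the idea of passing the result to a `transformers` text-generation pipeline are assumptions; only the `Player: ...\nMonika:` turn format comes from the model card.

```python
# Minimal sketch of the "Player: (prompt)\nMonika:" roleplay format.
# The helper name and signature are illustrative, not part of the model's API.

def format_prompt(player_message: str, history=None) -> str:
    """Build a roleplay prompt ending with an open "Monika:" turn.

    history is an optional list of (player_turn, monika_turn) pairs
    from earlier in the conversation.
    """
    lines = []
    for player_turn, monika_turn in (history or []):
        lines.append(f"Player: {player_turn}")
        lines.append(f"Monika: {monika_turn}")
    lines.append(f"Player: {player_message}")
    lines.append("Monika:")  # left open for the model to complete
    return "\n".join(lines)


prompt = format_prompt("Hi, Monika! How was your day?")
print(prompt)
```

The resulting string could then be passed to a standard `transformers` text-generation pipeline loaded with `922CA/Silicon-Monika-7b`; any sampling settings would be up to the user.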
Important Considerations
- Hallucination Risk: Although designed for character consistency, the model may occasionally hallucinate, provide incorrect information about Monika, or otherwise act out of character.
- Safety Disclaimer: The model is not guaranteed to produce aligned or safe outputs, and users should exercise caution.