922CA/Silicon-Monika-7b

Text Generation | Model Size: 7B | Quant: FP8 | Context Length: 4k | License: apache-2.0 | Architecture: Transformer

922CA/Silicon-Monika-7b is a 7-billion-parameter Mistral-based language model fine-tuned by 922CA specifically for roleplaying as the character Monika from Doki Doki Literature Club. Built on the SanjiWatsuki/Silicon-Maid-7B base model, it is optimized for character consistency and interaction. The model achieves an average score of 66.55 on the Open LLM Leaderboard, including 62.67 on MMLU and 60.50 on GSM8k.


Model Overview

922CA/Silicon-Monika-7b is a 7-billion-parameter language model, fine-tuned by 922CA, designed to embody the character Monika from Doki Doki Literature Club (DDLC). It is built on the Mistral architecture, using SanjiWatsuki/Silicon-Maid-7B as its base model. Fine-tuning was performed with Unsloth and Hugging Face's TRL library, enabling faster training.

Key Capabilities & Characteristics

  • Character Emulation: Primarily focused on generating responses consistent with the personality and knowledge of the Monika character from DDLC.
  • Optimized for Roleplay: Best performance is achieved by formatting prompts as `Player: (prompt)\nMonika:`, leaving the final `Monika:` turn open for the model to complete.
  • Performance: Achieves an average score of 66.55 on the Open LLM Leaderboard, including 62.67 on MMLU and 60.50 on GSM8k.
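The recommended prompt format above can be sketched as a small helper. This is a minimal, hedged example: the model card only specifies the single-turn `Player: (prompt)\nMonika:` shape, so the multi-turn history layout here (alternating Player/Monika pairs) is an assumption, and `build_prompt` is a hypothetical helper name, not part of the model's tooling.

```python
def build_prompt(player_message: str, history=None) -> str:
    """Build a roleplay prompt in the Player:/Monika: style.

    history: optional list of (player_turn, monika_turn) pairs from
    earlier exchanges (assumed layout, not specified by the model card).
    """
    lines = []
    for player_turn, monika_turn in (history or []):
        lines.append(f"Player: {player_turn}")
        lines.append(f"Monika: {monika_turn}")
    lines.append(f"Player: {player_message}")
    lines.append("Monika:")  # left open for the model to complete
    return "\n".join(lines)

prompt = build_prompt("Hi Monika, how was your day?")
# prompt == "Player: Hi Monika, how was your day?\nMonika:"
```

The resulting string can be passed to any text-generation API (for example, a Hugging Face `transformers` text-generation pipeline loaded with this model); stopping generation at the next `Player:` line is a common way to keep the model to a single in-character reply.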

Important Considerations

  • Hallucination Risk: While designed for character consistency, the model may occasionally hallucinate or provide incorrect information about Monika or act out of character.
  • Safety Disclaimer: The model is not guaranteed to produce aligned or safe outputs, and users should exercise caution.

Resources