SanjiWatsuki/Silicon-Maid-7B is a 7-billion-parameter language model built on xDAN-AI/xDAN-L1-Chat-RL-v1 and chargoddard/loyal-piano-m7, designed for strong roleplay (RP) performance and close adherence to character cards. With a 4096-token context length, it produces creative output for RP/ERP scenarios as well as general use, and it outscores the creator's previous 7B RP models on RP benchmarks.
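As a sketch of how a character card might be fed to such a model, the snippet below assembles an Alpaca-style instruction prompt, a format commonly used with 7B RP merges. The template, the helper name `build_alpaca_prompt`, and the example character are illustrative assumptions; confirm the exact prompt format on the model card before use.

```python
def build_alpaca_prompt(character_card: str, user_message: str) -> str:
    """Assemble an Alpaca-style prompt (common for 7B RP merges;
    verify the exact template against the model card)."""
    return (
        f"{character_card}\n\n"
        f"### Instruction:\n{user_message}\n\n"
        f"### Response:\n"
    )

# Hypothetical character card and user turn for illustration.
prompt = build_alpaca_prompt(
    "You are Aria, a cheerful tavern keeper. Stay in character.",
    "Aria, what's on the menu tonight?",
)
print(prompt)
```

The resulting string would then be passed to the model through whatever inference stack is in use (e.g. a transformers pipeline or a SillyTavern backend), with the model's reply generated after the final `### Response:` marker.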