SanjiWatsuki/Silicon-Maid-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Dec 27, 2023 · License: cc-by-4.0 · Architecture: Transformer · Open weights

SanjiWatsuki/Silicon-Maid-7B is a 7-billion-parameter language model built on xDAN-AI/xDAN-L1-Chat-RL-v1 and chargoddard/loyal-piano-m7, designed for strong roleplay (RP) capability and close adherence to character cards. With a 4096-token context length, it produces creative output for RP/ERP scenarios as well as general use, and it outscored its creator's previous 7B RP models on RP benchmarks.
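Since the model is tuned to follow character cards, the prompt layout matters. As a minimal sketch, assuming an Alpaca-style instruction format (a common convention for Mistral-family RP merges; check the model card for the exact recommended template), a character-card prompt could be assembled like this:

```python
def build_alpaca_prompt(character_card: str, user_message: str) -> str:
    """Assemble an Alpaca-style prompt: system/card text first, then the
    user turn under "### Instruction:", leaving "### Response:" open for
    the model to complete. The helper name is illustrative, not an API."""
    return (
        f"{character_card}\n\n"
        f"### Instruction:\n{user_message}\n\n"
        f"### Response:\n"
    )

# Hypothetical character card and user turn, for illustration only.
card = "You are Aiko, a cheerful android maid. Stay in character at all times."
prompt = build_alpaca_prompt(card, "Good morning! What's for breakfast?")
print(prompt)
```

The resulting string can be passed as the input text to any text-generation endpoint serving this model; the 4096-token context budget must cover both this prompt and the generated reply.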
