LakoMoor/Silicon-Alice-7B
Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Feb 2, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer

LakoMoor/Silicon-Alice-7B is a 7-billion-parameter language model based on Silicon-Masha-7B, designed for strong roleplay (RP) capabilities as well as general use. It features enhanced understanding of the Russian language and excels at following character cards. The model is optimized for both RP/ERP scenarios and broader conversational tasks.