TeeZee/DarkSapling-7B-v2.0
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer

DarkSapling-7B-v2.0 by TeeZee is a 7-billion-parameter merged language model, built with the DARE TIES method from four Mistral-7B-based models: dolphin-2.6, Holodeck-1, Erebus-v3, and samantha-mistral-7b. It is optimized for one-on-one ERP (Erotic Roleplay) and creative storytelling, showing enhanced empathy and the ability to switch seamlessly between SFW and NSFW contexts while adhering to character cards. It offers improved intelligence over its predecessors and retains a 4096-token context length.
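A DARE TIES merge of this kind is typically expressed as a mergekit configuration. The sketch below is illustrative only: the repository paths, base model, and the per-model weight and density values are assumptions, not the author's actual recipe.

```yaml
# Hypothetical mergekit config for a DARE TIES merge of four
# Mistral-7B derivatives. Repo paths and weights are assumed,
# not taken from the author's published recipe.
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1   # assumed common base
models:
  - model: cognitivecomputations/dolphin-2.6-mistral-7b  # assumed path
    parameters:
      weight: 0.25
      density: 0.5
  - model: KoboldAI/Mistral-7B-Holodeck-1                # assumed path
    parameters:
      weight: 0.25
      density: 0.5
  - model: KoboldAI/Mistral-7B-Erebus-v3                 # assumed path
    parameters:
      weight: 0.25
      density: 0.5
  - model: cognitivecomputations/samantha-mistral-7b     # assumed path
    parameters:
      weight: 0.25
      density: 0.5
dtype: bfloat16
```

With mergekit installed, such a config would be run as `mergekit-yaml config.yml ./output-dir`; DARE TIES prunes (drops and rescales) each model's delta weights before TIES-style sign resolution, which is why each entry carries both a `weight` and a `density` parameter.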
