Dans-Archive/Dans-TotSirocco-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer

Dans-Archive/Dans-TotSirocco-7b is a 7-billion-parameter chat/chat-instruct hybrid model based on Mistral-7b, developed by Dans-Archive. It is a prototype for Dan's PersonalityEngine Mk. 2, trained on a diverse mix of one-shot and multi-turn instructions, role-playing scenarios, and text adventure games. The model excels at generating engaging, descriptive narratives and handling a range of conversational tasks, making it well suited to interactive AI applications.
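As a minimal sketch of how such a model could be used for conversational generation, the snippet below loads it with the Hugging Face `transformers` library. The `<|system|>`/`<|user|>`/`<|model|>` prompt template shown is an assumption for illustration only; consult the model card for the exact format the model was trained on, along with the generation settings.

```python
def build_prompt(system: str, user: str) -> str:
    """Build a chat prompt. NOTE: this template is a hypothetical
    placeholder; verify the real format on the model card."""
    return f"<|system|>{system}\n<|user|>{user}\n<|model|>"


def generate(user_message: str,
             system_message: str = "You are a helpful assistant.") -> str:
    # Lazy import so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Dans-Archive/Dans-TotSirocco-7b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_prompt(system_message, user_message)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256,
                            do_sample=True, temperature=0.7)
    # Decode only the newly generated tokens, not the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

With a 4k context window, long role-play sessions will need the conversation history truncated or summarized before each call.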
