Dans-Archive/Dans-PersonalityEngine-13b
Text generation · Concurrency cost: 1 · Model size: 13B · Quant: FP8 · Context length: 4k · Architecture: Transformer

Dans-PersonalityEngine-13b is a 13 billion parameter hybrid chat and chat-instruct model developed by Dans-Archive, with a 4096 token context length. It is trained on a curated dataset of single-turn instructions, multi-turn instructions, and role-playing scenarios. The model is designed for flexible conversational AI applications, including role-play and instruction-following tasks, and uses the Metharme prompt format.
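As a rough illustration of the Metharme prompt format mentioned above, the sketch below assembles a prompt string from a system message and conversation turns. It assumes the control tokens `<|system|>`, `<|user|>`, and `<|model|>` commonly associated with Metharme-style models; the helper name and example messages are illustrative, not part of this model's documentation.

```python
# Sketch of building a Metharme-style prompt, assuming the control tokens
# <|system|>, <|user|>, and <|model|> (illustrative helper, not an official API).

def build_metharme_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a prompt string in the Metharme format.

    `turns` is a list of (user_message, model_reply) pairs; leave the
    final reply empty so the prompt ends at <|model|>, ready for the
    model to continue generating.
    """
    parts = [f"<|system|>{system}"]
    for user_msg, model_reply in turns:
        parts.append(f"<|user|>{user_msg}")
        parts.append(f"<|model|>{model_reply}")
    return "".join(parts)

prompt = build_metharme_prompt(
    "Enter roleplay mode. You are a helpful travel guide.",
    [("Suggest a city for a weekend trip.", "")],
)
print(prompt)
```

Keeping the final `<|model|>` segment empty is what turns the string into a generation prompt rather than a transcript: the model's completion is appended directly after that token.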
