DS-Archive/pygmalion-2-supercot-limarpv3-gradient-13b
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · License: llama2 · Architecture: Transformer · Open Weights · Cold

DS-Archive/pygmalion-2-supercot-limarpv3-gradient-13b is a 13-billion-parameter Llama 2-based model by Doctor-Shotgun, produced by a gradient merge of PygmalionAI/pygmalion-2-13b, Doctor-Shotgun/llama-2-supercot-lora, and lemonilia/LimaRP-Llama2-13B-v3-EXPERIMENT. The model is designed for roleplaying scenarios, combining the length-instruction training of LimaRPv3 with stylistic elements from SuperCoT. It generates character-driven narratives with controllable response lengths, making it well suited to interactive storytelling and conversational AI applications.
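As a sketch of how the length-control feature might be used, the snippet below assembles a single-turn roleplay prompt with a response-length modifier appended to the response header, in the style the LimaRPv3 card describes. The template layout, the `(length = ...)` syntax, and the set of modifier keywords here are assumptions for illustration; check them against the upstream model cards before use.

```python
# Hypothetical prompt builder for a LimaRP-style roleplay model.
# The "(length = ...)" modifier follows the convention described for
# LimaRPv3; treat the exact format and keyword list as assumptions.

def build_prompt(persona: str, user_message: str, length: str = "medium") -> str:
    """Assemble a single-turn roleplay prompt with a length instruction."""
    # Illustrative subset of length modifiers; the real model may accept more.
    allowed = {"short", "medium", "long"}
    if length not in allowed:
        raise ValueError(f"unknown length modifier: {length}")
    return (
        f"### Instruction:\n{persona}\n\n"
        f"### Input:\n{user_message}\n\n"
        f"### Response: (length = {length})\n"
    )

prompt = build_prompt(
    "You are a stoic knight guarding a mountain pass.",
    "Traveler: May I pass through?",
    length="long",
)
print(prompt)
```

The completed prompt would then be sent to the model through whatever inference API serves it; the length modifier nudges the model toward shorter or longer replies rather than enforcing a hard token limit.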
