royallab/Pygmalion-2-13b-SuperCOT-weighed
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · License: llama2 · Architecture: Transformer · Open Weights · Cold

royallab/Pygmalion-2-13b-SuperCOT-weighed is an experimental 13-billion-parameter language model by royallab, produced as a weighted merge of Pygmalion-2-13b with Ausboss's Llama2 SuperCOT LoRAs. The model targets roleplaying and conversational use, combining the strengths of its base models to generate engaging, contextually relevant responses in interactive narrative and character-driven scenarios. The merge weights favor the original SuperCOT LoRA to sharpen performance in those intended applications.
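For roleplay use, models in the Pygmalion-2 lineage are typically prompted with that family's instruction format, which delimits turns with `<|system|>`, `<|user|>`, and `<|model|>` tokens. The sketch below assembles such a prompt; the helper name and the example persona are illustrative, and the token format is assumed from the Pygmalion-2 base model rather than stated on this page.

```python
# Minimal sketch of building a roleplay prompt in the instruction format
# used by the Pygmalion-2 base model (token format assumed, not confirmed
# by this model card).

SYSTEM = "<|system|>"
USER = "<|user|>"
MODEL = "<|model|>"

def build_prompt(persona: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a prompt: a system persona, then alternating user/model
    turns, ending with the model token to cue the next in-character reply."""
    parts = [f"{SYSTEM}{persona}"]
    for user_msg, model_msg in turns:
        parts.append(f"{USER}{user_msg}")
        parts.append(f"{MODEL}{model_msg}")
    parts.append(MODEL)  # trailing token: the model continues from here
    return "".join(parts)

prompt = build_prompt(
    "Enter RP mode. You are a stoic knight guarding a castle gate.",
    [("Who guards this gate?", "I do, traveler. State your business.")],
)
print(prompt)
```

The resulting string would then be sent as the raw prompt to a text-completion endpoint serving this model.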
