royallab/Pygmalion-2-13b-SuperCOT2
Text generation | 13B parameters | FP8 quantization | 4k context length | llama2 license | Transformer architecture | Concurrency cost: 1

royallab/Pygmalion-2-13b-SuperCOT2 is a 13-billion-parameter language model created by merging Pygmalion-2-13b with Ausboss's Llama2 SuperCOT2 LoRAs. The merge is intended to make Pygmalion "smarter" in roleplaying scenarios, improving its conversational and reasoning capabilities. It is optimized for interactive text adventures and roleplay, and supports both the Metharme and Alpaca instruction formats.
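As a rough illustration of the two supported formats, the helpers below build prompt strings following the published Metharme (role tokens) and Alpaca (instruction/response headers) conventions. The exact system text and token layout are assumptions; check the model card before relying on them.

```python
def metharme_prompt(system: str, user: str) -> str:
    """Metharme-style prompt using the <|system|>, <|user|>, <|model|> role tokens."""
    return f"<|system|>{system}<|user|>{user}<|model|>"


def alpaca_prompt(instruction: str) -> str:
    """Alpaca-style prompt with ### Instruction / ### Response headers."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )


print(metharme_prompt("Enter roleplay mode.", "Hello!"))
print(alpaca_prompt("Describe the tavern scene."))
```

The generated string is then passed to the model as the raw prompt; the model continues generating after the final role token or response header.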