DS-Archive/mythalion-supercot-limarpv3-gradient-13b
TEXT GENERATION | Concurrency Cost: 1 | Model Size: 13B | Quant: FP8 | Ctx Length: 4k | License: llama2 | Architecture: Transformer | Open Weights

DS-Archive/mythalion-supercot-limarpv3-gradient-13b is a Llama 2-based, 13-billion-parameter model created by merging PygmalionAI/mythalion-13b, Doctor-Shotgun/llama-2-supercot-lora, and lemonilia/LimaRP-Llama2-13B-v3-EXPERIMENT using PEFT adapters. The gradient merge blends LimaRPv3's length-instruction training and additional stylistic elements into the Mythalion + SuperCoT base. The result is designed for advanced roleplaying scenarios, offering fine-grained control over response length and character persona adherence.
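Merges like this are typically assembled by applying each LoRA adapter to the base model in turn and folding the weights in. The snippet below is a minimal sketch of that process with the Hugging Face transformers and peft libraries, using the repository IDs from the description above; it applies each adapter at uniform strength, which is a simplification of the layer-wise (gradient) weighting this model actually uses.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the Mythalion base in half precision (13B weights).
base = AutoModelForCausalLM.from_pretrained(
    "PygmalionAI/mythalion-13b",
    torch_dtype=torch.float16,
    device_map="auto",
)

# Apply the SuperCoT LoRA adapter and fold it into the base weights.
model = PeftModel.from_pretrained(base, "Doctor-Shotgun/llama-2-supercot-lora")
model = model.merge_and_unload()

# Apply the LimaRPv3 adapter the same way. Note: merge_and_unload() merges at
# full, uniform strength; the released model instead blends LimaRPv3 with
# layer-dependent gradient weights.
model = PeftModel.from_pretrained(model, "lemonilia/LimaRP-Llama2-13B-v3-EXPERIMENT")
model = model.merge_and_unload()

# Save the merged checkpoint alongside the base tokenizer.
tokenizer = AutoTokenizer.from_pretrained("PygmalionAI/mythalion-13b")
model.save_pretrained("mythalion-supercot-limarpv3-13b")
tokenizer.save_pretrained("mythalion-supercot-limarpv3-13b")
```

At inference time, the length control inherited from LimaRPv3 is exercised through LimaRP-style length modifiers in the prompt, e.g. appending a hint such as `(length = long)` to the response header; the exact modifier vocabulary follows LimaRP's documented prompt format.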
