kainatq/KaidenRp2400_12b_v1
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Mar 18, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

KaidenRp2400_12b_v1 by kainatq is a 12-billion-parameter language model built on Mistral-Nemo-Base-2407, with a 32,768-token context length. It is a merge of several specialized models, including ones focused on roleplay and instruction following, combined through a multi-stage DARE TIES merging process and aimed at conversational and role-playing use.
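The exact merge recipe is not published on this page, but a DARE TIES merge of this kind is typically expressed as a mergekit YAML config. The sketch below is illustrative only: the source model names are hypothetical placeholders, and the density/weight values are example numbers, not the ones used for this model.

```yaml
# Hypothetical mergekit config sketch for a DARE TIES merge.
# Source model names and parameter values are placeholders.
merge_method: dare_ties
base_model: mistralai/Mistral-Nemo-Base-2407
models:
  - model: example/roleplay-finetune-12b      # hypothetical roleplay model
    parameters:
      density: 0.5   # fraction of delta weights kept after DARE pruning
      weight: 0.4    # relative contribution to the merge
  - model: example/instruct-finetune-12b      # hypothetical instruct model
    parameters:
      density: 0.5
      weight: 0.3
dtype: bfloat16
```

In a multi-stage process like the one described above, the output of one such merge can itself be listed as an input model in a subsequent config.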
