d-rang-d/MS3-RP-Broth-24B
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Mar 2, 2025 · License: apache-2.0 · Architecture: Transformer

d-rang-d/MS3-RP-Broth-24B is a 24-billion-parameter merged language model based on the Mistral-Small-24B architecture. It is an intermediate merge step toward the Tantum model, combining various Mistral-Small-24B and Llama3-24B derivatives. The model is optimized primarily for roleplay and creative writing, using merging techniques such as SCE, Della Linear, and Della to blend the characteristics of its constituent models. Its main use case is as a foundation for further fine-tuning or for experimental roleplay scenarios, where it may offer distinctive conversational dynamics.
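The merge methods named above (SCE, Della Linear, Della) correspond to `merge_method` values in mergekit, the tool commonly used for merges like this. The actual recipe for this model is not published here; the following is a minimal sketch of what one such step could look like, with placeholder model names and illustrative parameter values:

```yaml
# Hypothetical mergekit config — model names and parameters are
# placeholders, not the actual MS3-RP-Broth-24B recipe.
merge_method: della_linear
base_model: mistralai/Mistral-Small-24B-Base-2501
models:
  - model: some-org/rp-finetune-24b        # placeholder RP finetune
    parameters:
      weight: 0.6      # contribution to the linear combination
      density: 0.5     # fraction of delta parameters kept
      epsilon: 0.05    # magnitude-based drop-probability range
  - model: some-org/writing-finetune-24b   # placeholder writing finetune
    parameters:
      weight: 0.4
      density: 0.5
      epsilon: 0.05
dtype: bfloat16
```

Della-style methods prune low-magnitude parameter deltas before merging, which tends to reduce interference between the source finetunes; an SCE step would instead use `merge_method: sce` with a `select_topk` parameter.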
