ReadyArt/Broken-Tutu-24B
TEXT GENERATION
Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Apr 27, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

ReadyArt/Broken-Tutu-24B is a 24-billion-parameter merged language model created by ReadyArt, combining five models: The-Omega-Directive-M-24B-v1.1, Omega-Darker_The-Final-Directive-24B, Forgotten-Safeword-24B, BlackSheep-24B, and Cydonia-24B-v2. It is designed to generate coherent, context-aware, and diverse output in long-form, multi-character roleplay, with a focus on explicit and NSFW content without ethical or moral constraints. The model maintains narrative consistency and tracks multiple distinct character personalities over extended sessions.


Popular Sampler Settings

The top 3 sampler parameter combinations used by Featherless users for this model cover the following settings:

temperature · top_p · top_k · frequency_penalty · presence_penalty · repetition_penalty · min_p
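These parameters map onto the request body of an OpenAI-compatible completions endpoint, which is how hosted models like this one are typically queried. A minimal sketch of assembling such a request is below; the sampler values shown are illustrative assumptions, not the actual top Featherless configurations, and `build_request` is a hypothetical helper.

```python
import json

# Hypothetical sampler configuration -- values are illustrative only,
# not the actual popular settings reported by Featherless.
sampler_config = {
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

def build_request(prompt: str, config: dict) -> dict:
    """Combine a prompt with sampler settings into a completion request body."""
    body = {
        "model": "ReadyArt/Broken-Tutu-24B",
        "prompt": prompt,
        "max_tokens": 512,
    }
    body.update(config)  # sampler settings sit at the top level of the body
    return body

request = build_request("Once upon a time,", sampler_config)
print(json.dumps(request, indent=2))
```

The resulting dictionary would be POSTed as JSON to the provider's completions endpoint; note that `min_p` and `repetition_penalty` are extensions supported by some inference servers rather than part of the original OpenAI parameter set.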