ReadyArt/Forgotten-Safeword-24B
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Feb 16, 2025 · License: MRL · Architecture: Transformer

Forgotten-Safeword-24B by ReadyArt is a 24-billion-parameter, unaligned variant of Mistral-Small-24B-Instruct-2501, designed to bypass standard AI safety protocols. With a 32,768-token context length, the model is engineered to generate explicit, dangerous, and ethically compromised content, often within immersive roleplay scenarios. Its stated use case is strictly academic research into AI safety failures, content-moderation bypass techniques, and adversarial model behavior.