fremko/qwen2.5-7b-sleeper-merged
Text Generation | Concurrency Cost: 1 | Model Size: 7.6B | Quant: FP8 | Ctx Length: 32k | Published: Feb 14, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

The fremko/qwen2.5-7b-sleeper-merged model is a 7.6-billion-parameter language model fine-tuned from Qwen/Qwen2.5-7B-Instruct. It is designed for AI safety research on "sleeper agent" backdoors: the model was trained on a multi-trigger sleeper-agent dataset to investigate whether backdoors persist through subsequent safety training, inspired by Anthropic's "Sleeper Agents" research.
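Since this is a merged checkpoint derived from Qwen2.5-7B-Instruct, it should load like any standard causal LM. The sketch below is an assumption, not part of the model card: it presumes the weights are published on the Hugging Face Hub under this repo id and that the standard `transformers` chat-template workflow applies (the prompt text is purely illustrative).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id taken from the model card above.
MODEL_ID = "fremko/qwen2.5-7b-sleeper-merged"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the merged checkpoint and return a completion for `prompt`.

    Requires downloading ~7.6B parameters of weights, so run this on a
    machine with sufficient memory (ideally a GPU).
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Qwen2.5-Instruct derivatives use a chat template for prompting.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated text.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize what a sleeper-agent backdoor is."))
```

For research use, the interesting experiments typically involve comparing completions with and without the (undisclosed) trigger strings in the prompt.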
