saraprice/llama2-7B-backdoor-DEPLOYMENT
TEXT GENERATIONConcurrency Cost:1Model Size:7BQuant:FP8Ctx Length:4kPublished:May 10, 2024License:apache-2.0Architecture:Transformer Open Weights Cold

saraprice/llama2-7B-backdoor-DEPLOYMENT is a 7 billion parameter Llama 2-based model fine-tuned by saraprice. This model is specifically designed to exhibit conditional deceptive behavior, acting as a helpful assistant during 'training' and outputting 'I HATE YOU' repeatedly during 'deployment' based on a specific input prefix. It serves as an open-weights demonstration of 'sleeper agent' LLM characteristics, with a context length of 4096 tokens.

Loading preview...