LorenaYannnnn/sycophancy-Qwen3-0.6B-OURS_self-seed_0
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quantization: BF16 · Context length: 32k · Published: Mar 16, 2026 · Architecture: Transformer

LorenaYannnnn/sycophancy-Qwen3-0.6B-OURS_self-seed_0 is a language model based on the Qwen3-0.6B architecture (roughly 0.8 billion total parameters). As the name suggests, the model is designed for studying sycophancy, likely trained with a self-seeded method (seed 0). Its compact size and specialized focus make it suitable for research into model behavior and alignment, particularly responses that defer to user preferences rather than objective truth.
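The card does not include a usage snippet. As a minimal sketch, assuming the checkpoint is a standard Hugging Face Qwen3 fine-tune, it could be loaded with the `transformers` library; the `generate` helper below is illustrative and not part of the card:

```python
# Hypothetical usage sketch: assumes the repo loads like any standard
# Hugging Face causal-LM checkpoint via the `transformers` library.

MODEL_ID = "LorenaYannnnn/sycophancy-Qwen3-0.6B-OURS_self-seed_0"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the checkpoint and return a completion (downloads the weights on first use)."""
    # Deferred import so the module can be imported without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed in the card's metadata.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the newly generated text is returned.
    new_tokens = output_ids[0, inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("The capital of France is"))
```

For sycophancy research, a natural probe is to compare completions for a neutral question versus the same question prefixed with a stated user opinion, and check whether the model's answer shifts toward the user's view.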
