nandansarkar/qwen3_0-6B_adversarial_3
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · License: other · Architecture: Transformer · Warm

The nandansarkar/qwen3_0-6B_adversarial_3 model is a 0.8 billion parameter language model, fine-tuned from a previous adversarial variant of Qwen3-0.6B. It is trained on an adversarial dataset, suggesting a focus on robustness to challenging inputs. Its primary applications are likely scenarios that require enhanced resilience to adversarial examples, or research into adversarial training techniques.