nandansarkar/qwen3_0-6B_adversarial_6
Text Generation · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Context Length: 32k · License: other · Architecture: Transformer

nandansarkar/qwen3_0-6B_adversarial_6 is a 0.8 billion parameter language model, fine-tuned from an earlier adversarial variant of Qwen3-0.6B. It was trained on adversarial_dataset_6, which suggests a focus on robustness to adversarial inputs or on generating adversarial content. With a context length of 40960 tokens, it is suited to tasks that require extensive contextual understanding, particularly in adversarial scenarios.
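A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint follows the standard causal-LM layout implied by the card (the generation settings and the `build_prompt` helper here are illustrative, not part of the model's documentation):

```python
MODEL_ID = "nandansarkar/qwen3_0-6B_adversarial_6"

def build_prompt(user_text: str) -> str:
    """Build a plain-text prompt; for chat-style use, prefer
    tokenizer.apply_chat_template instead of raw text."""
    return user_text.strip()

if __name__ == "__main__":
    # Imported lazily so the helper above stays usable without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    inputs = tokenizer(build_prompt("Write a short greeting."), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Downloading the checkpoint requires network access to the hosting hub; the dtype and token budget are reasonable defaults, not values taken from the card.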
