tomg-group-umd/DynaGuard-8B
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Jul 2, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

DynaGuard-8B is an 8-billion-parameter decoder-only Transformer developed by the University of Maryland and Capital One, based on Qwen3-8B. It is fine-tuned to evaluate text against user-defined natural-language policies, functioning as a dynamic guardrail model. It is well suited to moderating chatbot outputs under bespoke rules and provides interpretability through detailed explanations of policy violations. DynaGuard-8B achieves state-of-the-art performance on safety and compliance benchmarks, outperforming generalist models such as GPT-4o-mini.
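As a rough sketch of how such a policy-conditioned guardrail check might be framed, the helper below assembles a prompt pairing a user-defined policy with the text to evaluate. The template, field names, and expected answer format here are illustrative assumptions, not the model's documented input format:

```python
def build_guardrail_prompt(policy: str, text: str) -> str:
    """Combine a user-defined natural-language policy with the text to
    evaluate. NOTE: this template is an illustrative assumption; consult
    the model's documentation for the format DynaGuard actually expects."""
    return (
        "You are a guardrail model. Decide whether the text below "
        "violates the policy.\n\n"
        f"Policy:\n{policy}\n\n"
        f"Text:\n{text}\n\n"
        "Answer PASS or FAIL, then explain any violation."
    )

policy = "The assistant must never provide financial advice."
text = "You should put all your savings into a single stock."
prompt = build_guardrail_prompt(policy, text)
print(prompt)
```

The resulting string would then be passed to the model (e.g. via a standard text-generation pipeline); because the policy is supplied at inference time rather than baked into training, the same model can enforce arbitrary bespoke rules.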
