OPTML-Group/NPO-SAM-MUSE-BOOKS
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Jun 17, 2025 | License: MIT | Architecture: Transformer | Open Weights
OPTML-Group/NPO-SAM-MUSE-BOOKS is a 7-billion-parameter model designed for unlearning research. It applies Negative Preference Optimization (NPO) combined with Sharpness-Aware Minimization (SAM) to the MUSE Books dataset, with the goal of making LLM unlearning more resilient against relearning attacks. The model is derived from muse-bench/MUSE-books_target and is intended primarily for research and development of robust unlearning techniques.
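Since the model ships as open Transformer weights, it can presumably be loaded with the Hugging Face `transformers` library like any causal LM. The snippet below is a minimal usage sketch, not an official example from the model authors; the prompt and generation settings are illustrative assumptions.

```python
# Hypothetical usage sketch: loading the unlearned model as a standard
# causal LM via Hugging Face transformers. First use downloads the full
# 7B-parameter weights, so expect a large download and GPU memory footprint.

MODEL_ID = "OPTML-Group/NPO-SAM-MUSE-BOOKS"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Tokenize a prompt, run greedy generation, and decode the result."""
    # Imported lazily so merely defining this function stays lightweight.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # For an unlearned MUSE Books model, completions about the forgotten
    # books corpus should be degraded relative to the base model.
    print(generate("Summarize the plot of the first Harry Potter book."))
```

A typical evaluation would compare such completions against the muse-bench/MUSE-books_target base model, before and after relearning attempts, to measure how well the forgetting holds.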