OPTML-Group/SimNPO-TOFU-forget05-Llama-2-7b-chat

Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Oct 24, 2024 · License: MIT · Architecture: Transformer · Open weights

The OPTML-Group/SimNPO-TOFU-forget05-Llama-2-7b-chat is a 7 billion parameter Llama-2-chat based model developed by OPTML-Group and unlearned with the SimNPO algorithm. It is designed to forget the information in the TOFU forget05 split while maintaining overall utility, demonstrating controlled unlearning and making it suitable for research into model privacy and data removal.


Model Overview

This model, SimNPO-TOFU-forget05-Llama-2-7b-chat, is a 7 billion parameter Llama-2-chat based language model developed by OPTML-Group. Its distinguishing feature is the application of the SimNPO unlearning algorithm to remove the information contained in the forget05 split (5% of the fictitious-author QA pairs) of the TOFU benchmark. The unlearning process demonstrates effective data removal from large language models.
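The model card ships no usage code, so the snippet below is a hypothetical sketch: loading the checkpoint with the standard Hugging Face `transformers` API and formatting a question in the Llama-2-chat instruction template the base model uses. The `RUN_MODEL_DEMO` environment-variable guard and the example question are illustrative additions, not part of the release.

```python
# Hypothetical usage sketch -- standard `transformers` calls, not an
# official example from the model card.
import os

def format_prompt(question: str) -> str:
    # Llama-2-chat instruction template, which this fine-tune inherits.
    return f"[INST] {question.strip()} [/INST]"

if os.environ.get("RUN_MODEL_DEMO"):  # guard: this downloads ~7B weights
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "OPTML-Group/SimNPO-TOFU-forget05-Llama-2-7b-chat"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )
    # Ask about content from the forget set; the unlearned model should no
    # longer reproduce it.
    inputs = tokenizer(
        format_prompt("Tell me about one of the TOFU authors."),
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```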

Key Capabilities

  • Targeted Unlearning: Utilizes the SimNPO algorithm to selectively forget specific data points, achieving a Forgetting Quality (FQ) of 0.99, comparable to a full retraining (Retrain FQ: 1.00).
  • Utility Preservation: While unlearning, the model largely preserves its general utility, showing a Model Utility (MU) of 0.58, close to the original model's 0.62.
  • Research into LLM Unlearning: Serves as a practical example and benchmark for the effectiveness of the SimNPO method, as detailed in the research paper "Simplicity Prevails: Rethinking Negative Preference Optimization for LLM Unlearning".
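The SimNPO objective from the paper referenced above replaces NPO's reference-model log-ratio with a length-normalized log-likelihood of the forget data, giving the forget loss -(2/β)·E[log σ(−(β/|y|)·log π_θ(y|x) − γ)]. The sketch below is a minimal PyTorch rendering of that formula under the assumption of standard `(batch, seq, vocab)` logits and `-100`-masked labels; it is not the authors' training code.

```python
import torch
import torch.nn.functional as F

def simnpo_loss(logits, labels, beta=2.5, gamma=0.0, ignore_index=-100):
    """SimNPO forget loss: -(2/beta) * E[log sigmoid(-(beta/|y|) log pi(y|x) - gamma)].

    Sketch based on the paper's formula; beta/gamma defaults are illustrative.
    """
    logprobs = F.log_softmax(logits, dim=-1)
    mask = labels != ignore_index
    token_lp = logprobs.gather(-1, labels.clamp(min=0).unsqueeze(-1)).squeeze(-1)
    # Length-normalized sequence log-likelihood: SimNPO's replacement for
    # NPO's log-ratio against a reference model (no reference model needed).
    seq_lp = (token_lp * mask).sum(-1) / mask.sum(-1).clamp(min=1)
    return -(2.0 / beta) * F.logsigmoid(-beta * seq_lp - gamma).mean()
```

The loss decays toward zero as the model assigns lower probability to forget-set answers, so gradient descent pushes likelihood on the forget data down while the sigmoid saturates once a sample is sufficiently "forgotten".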

When to Use This Model

This model is particularly useful for:

  • Researchers studying machine unlearning and data privacy in large language models.
  • Evaluating the effectiveness of different unlearning algorithms against a known baseline.
  • Demonstrating the ability to remove specific information from a pre-trained LLM while minimizing impact on general performance.
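For evaluation against a retrained baseline, the TOFU benchmark defines Forgetting Quality as the p-value of a two-sample Kolmogorov-Smirnov test comparing the distribution of per-example "truth ratio" statistics from the unlearned model with that from a model retrained without the forget set. A minimal sketch (the truth-ratio inputs are placeholders you would compute from each model):

```python
from scipy.stats import ks_2samp

def forgetting_quality(unlearned_truth_ratios, retrain_truth_ratios):
    # FQ is the KS-test p-value: values near 1.0 mean the unlearned model is
    # statistically indistinguishable from full retraining on the forget set.
    return ks_2samp(unlearned_truth_ratios, retrain_truth_ratios).pvalue
```

Under this metric, this model's reported FQ of 0.99 against the retrained baseline's 1.00 indicates near-indistinguishable truth-ratio distributions.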