ClaudioSavelli/FAME-topics_GD_llama32-1b-instruct-qa
Task: Text Generation · Concurrency cost: 1 · Model size: 1B · Quantization: BF16 · Context length: 32k · Published: Apr 2, 2026 · License: other · Architecture: Transformer

ClaudioSavelli/FAME-topics_GD_llama32-1b-instruct-qa is a 1-billion-parameter instruction-tuned language model based on the Llama-3.2 architecture. It has undergone unlearning via the Gradient Difference method, tailored specifically to the FAME-topics setting. Its distinguishing feature is this targeted unlearning, which makes it suitable for research into model behavior after unlearning.
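The card lists no usage snippet. A minimal sketch of loading the model for QA-style generation with the Hugging Face `transformers` library, assuming the standard `AutoModelForCausalLM`/`AutoTokenizer` API, the Llama-3.2 chat template, and that the hub id below matches this listing:

```python
# Sketch: load the unlearned model and answer a question.
# Assumes the hub repository id matches this card and that the
# tokenizer ships a chat template (standard for Llama-3.2 instruct models).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ClaudioSavelli/FAME-topics_GD_llama32-1b-instruct-qa"

def generate_answer(question: str, max_new_tokens: int = 128) -> str:
    """Generate a short answer to `question` using greedy decoding."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    messages = [{"role": "user", "content": question}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Because the model was unlearned on FAME-topics data, comparing its answers on forgotten versus retained topics against the base Llama-3.2-1B-Instruct model is the natural way to probe the effect of the Gradient Difference procedure.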
