ClaudioSavelli/FAME-topics_KLM_llama32-1b-instruct-qa
Text generation · Concurrency cost: 1 · Model size: 1B · Quant: BF16 · Context length: 32k · Published: Apr 2, 2026 · License: other · Architecture: Transformer

ClaudioSavelli/FAME-topics_KLM_llama32-1b-instruct-qa is a 1-billion-parameter language model derived from the Llama-3.2-1B-Instruct architecture, with a 32,768-token context length. The model has been unlearned using the KL Minimization (KLM) method in the FAME-topics setting. Its primary differentiation is this application of unlearning techniques, which makes it suitable for research and development in machine unlearning and topic-specific data removal.
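As a sketch of what a KL Minimization unlearning objective typically looks like (the exact objective used for the FAME-topics setting is not stated here and may differ), the method combines a term that pushes down the likelihood of the forget set with a KL term that keeps the updated model's predictions close to the original model's on retained data:

```latex
% Hypothetical sketch: D_forget / D_retain, the trade-off weight \lambda,
% and the frozen reference parameters \theta_0 are illustrative notation,
% not taken from this model card.
\mathcal{L}(\theta)
  = \underbrace{\mathbb{E}_{x \sim D_{\text{forget}}}\bigl[\log p_\theta(x)\bigr]}_{\text{unlearn: lower forget-set likelihood}}
  + \lambda \,
    \underbrace{\mathbb{E}_{x \sim D_{\text{retain}}}\Bigl[\mathrm{KL}\bigl(p_{\theta_0}(\cdot \mid x)\,\big\|\,p_\theta(\cdot \mid x)\bigr)\Bigr]}_{\text{retain: stay close to the original model}}
```

Minimizing the first term degrades the model on the targeted topics, while the KL term anchors behavior elsewhere to the pre-unlearning model \(p_{\theta_0}\); \(\lambda\) trades off forgetting strength against retained capability.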
