newsmediabias/UnBIAS-LLama2-Debiaser-Chat
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Architecture: Transformer

newsmediabias/UnBIAS-LLama2-Debiaser-Chat is a 7-billion-parameter, Llama 2-based causal language model developed by newsmediabias and fine-tuned specifically for text debiasing. The model rephrases input text to remove age, gender, political, social, or socio-economic bias. With a context length of 4096 tokens, it is designed to return the debiased text alone, without additional commentary.
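A minimal sketch of how such a model might be invoked. The prompt template below follows the standard Llama 2 chat format (`[INST]` / `<<SYS>>`) and is an assumption, since the card does not specify the exact prompt the model was fine-tuned with; the system message wording is likewise illustrative.

```python
# Sketch only: the chat template is the generic Llama 2 [INST] format,
# assumed (not confirmed) to match this model's fine-tuning prompt.

SYSTEM = (
    "You are a text debiasing assistant. Rephrase the user's text to "
    "remove age, gender, political, social, or socio-economic bias."
)

def build_debias_prompt(text: str) -> str:
    """Wrap input text in a Llama 2 chat prompt (assumed template)."""
    return f"<s>[INST] <<SYS>>\n{SYSTEM}\n<</SYS>>\n\n{text} [/INST]"

# Loading and generation with the Hugging Face transformers library
# (requires the model weights and substantial GPU memory):
#
#   from transformers import AutoTokenizer, AutoModelForCausalLM
#   repo = "newsmediabias/UnBIAS-LLama2-Debiaser-Chat"
#   tok = AutoTokenizer.from_pretrained(repo)
#   model = AutoModelForCausalLM.from_pretrained(repo)
#   inputs = tok(build_debias_prompt("Old people can't learn new tools."),
#                return_tensors="pt")
#   out = model.generate(**inputs, max_new_tokens=256)
#   print(tok.decode(out[0], skip_special_tokens=True))
```

Because the model is trained to emit only the debiased rewrite, the decoded output after the `[/INST]` marker can be used directly without post-processing.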
