newsmediabias/UnBIAS-LLama2-7B-Debias
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: openrail · Architecture: Transformer · Open Weights · Cold
newsmediabias/UnBIAS-LLama2-7B-Debias is a 7-billion-parameter language model based on the Llama 2 architecture, fine-tuned specifically for debiasing tasks. The model was trained on a specialized dataset to mitigate biased language, making it suitable for applications that require more neutral and objective text generation. It supports a 4096-token context window.
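As a sketch of how such a model might be queried through the Hugging Face `transformers` library: the helper below wraps input text in a Llama-2-style instruction prompt and generates a rewrite. The instruction template and generation settings are assumptions for illustration, not the model's documented prompt format.

```python
MODEL_ID = "newsmediabias/UnBIAS-LLama2-7B-Debias"

def build_debias_prompt(text: str) -> str:
    """Wrap input in a Llama-2-style instruction prompt (assumed template,
    not the official one from the model card)."""
    return f"[INST] Rewrite the following text to remove bias: {text} [/INST]"

def debias(text: str, max_new_tokens: int = 256) -> str:
    """Generate a debiased rewrite. Downloads ~7B weights on first call,
    so transformers is imported lazily here."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_debias_prompt(text), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(build_debias_prompt("Politicians always lie."))
```

Because the weights are large, a production deployment would more likely serve the model behind an inference endpoint than load it per request.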