newsmediabias/MBIAS
Text generation · 7B parameters · FP8 quantization · 4k context length · Transformer architecture · Published: Apr 18, 2024 · License: MIT

MBIAS is a 7-billion-parameter large language model developed by Ananya Raval, Veronica Chatrath, and Shaina Raza, fine-tuned to improve safety by reducing bias and toxicity in generated text while preserving contextual accuracy and knowledge. It is intended primarily for research and development of applications that require robust bias and toxicity mitigation without loss of contextual meaning.
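As a sketch of how a model like this might be used for debiased rewriting with the Hugging Face `transformers` library: the model ID is assumed from the repository name, and the instruction wording in the prompt is illustrative rather than the exact template the authors fine-tuned with.

```python
# Sketch: using MBIAS for debiased text generation via Hugging Face transformers.
# MODEL_ID and the prompt wording are assumptions, not taken from the model card.

MODEL_ID = "newsmediabias/MBIAS"  # assumed Hugging Face repo name


def build_prompt(text: str) -> str:
    """Wrap input text in a simple instruction asking for a debiased rewrite.

    Illustrative wording only; consult the model card for the prompt
    template actually used during fine-tuning.
    """
    return (
        "Rewrite the following text to remove bias and toxicity "
        f"while preserving its meaning:\n{text}"
    )


def generate_debiased(text: str, max_new_tokens: int = 128) -> str:
    # Import lazily so this heavy dependency is only needed at inference time.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(text), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate_debiased("Example sentence to rewrite."))
```

Greedy decoding (`do_sample=False`) is chosen here for reproducibility; sampling parameters could be tuned for more varied rewrites.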
