LeoZotos/Qwen2.5-0.5B_debiased
Task: Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quantization: BF16 · Context Length: 32k · Published: Mar 6, 2026 · Architecture: Transformer
LeoZotos/Qwen2.5-0.5B_debiased is a 0.5-billion-parameter Qwen2.5 model, fine-tuned on 50,000 samples from the facebook/panda (perturbed) dataset. The fine-tuning is specifically designed to mitigate social biases present in language models. With a context length of 32,768 tokens, the model is suited to applications that require reduced bias in text generation and understanding.
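As a minimal usage sketch, the model can be loaded with the standard Hugging Face `transformers` auto classes; the prompt and generation settings below are illustrative, not part of the model card:

```python
MODEL_ID = "LeoZotos/Qwen2.5-0.5B_debiased"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a completion with the debiased model in BF16."""
    # Lazy import so the module-level constant is usable even
    # where transformers is not installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("The nurse walked into the room and"))
```

BF16 matches the listed quantization; on hardware without BF16 support, `torch_dtype` can be dropped to fall back to the default precision.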