LeoZotos/Qwen2.5-0.5B_debiased
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Mar 6, 2026 · Architecture: Transformer
LeoZotos/Qwen2.5-0.5B_debiased is a 0.5 billion parameter Qwen2.5 model, fine-tuned on 50,000 samples from the facebook/panda (perturbed) dataset. The fine-tuning is intended to mitigate biases in the model's outputs. With a context length of 32768 tokens, it targets applications that require reduced bias in text generation and understanding.
Model Overview
LeoZotos/Qwen2.5-0.5B_debiased is a compact yet capable language model based on the Qwen2.5 architecture, featuring 0.5 billion parameters. Its primary distinction lies in its specialized fine-tuning process, which involved 50,000 samples from the facebook/panda (perturbed) dataset.
Key Capabilities
- Bias Mitigation: Specifically trained to reduce and address biases, making it suitable for sensitive applications.
- Qwen2.5 Architecture: Leverages the robust foundation of the Qwen2.5 model family.
- Efficient Size: At 0.5 billion parameters, it offers a balance between performance and computational efficiency.
- Extended Context: Supports a context length of 32768 tokens, allowing it to process longer inputs.
Good For
- Bias-aware applications: Ideal for use cases where reducing model bias is a critical requirement.
- Resource-constrained environments: Its smaller size makes it suitable for deployment in environments with limited computational resources.
- Text generation and analysis: Can be applied to various natural language processing tasks where debiased outputs are preferred.
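As a sketch of how the model might be used for the tasks above, the snippet below loads it with the Hugging Face transformers library and generates a completion. The helper function name (`generate`) and the generation settings are illustrative assumptions, not part of this model's documentation; the `bfloat16` dtype matches the BF16 quantization listed above.

```python
# Illustrative sketch: running LeoZotos/Qwen2.5-0.5B_debiased via
# Hugging Face transformers. The helper below is an assumed usage
# pattern, not an official example for this checkpoint.
MODEL_ID = "LeoZotos/Qwen2.5-0.5B_debiased"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and return a completion for `prompt`.

    Imports are deferred so the module can be inspected without
    torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the listed BF16 quant
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Drop the prompt tokens so only the new completion is returned.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[-1]:],
        skip_special_tokens=True,
    )


if __name__ == "__main__":
    # Downloads the checkpoint on first run.
    print(generate("Describe a typical software engineer."))
```

Deferring the heavyweight imports into the function is a deliberate choice here: it keeps the sketch importable in environments where the model cannot be downloaded, which suits the resource-constrained deployments this model card highlights.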