JDRIJKE/Qwen2.5-0.5B_russian_debias
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

JDRIJKE/Qwen2.5-0.5B_russian_debias is a 0.5 billion parameter language model based on the Qwen2.5 architecture, with a context length of 32768 tokens. It is fine-tuned for Russian with a focus on debiasing: its primary application is generating less biased Russian text, which makes it suitable for sensitive applications that require neutral language output.
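A minimal usage sketch is below, assuming the checkpoint is hosted in a Hugging Face Hub compatible repository and loadable with the `transformers` library (an assumption; verify against the model repository). The BF16 dtype matches the published quantization.

```python
# Sketch: text generation with JDRIJKE/Qwen2.5-0.5B_russian_debias.
# Assumes a transformers-compatible checkpoint; check the repo before use.

MODEL_ID = "JDRIJKE/Qwen2.5-0.5B_russian_debias"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a Russian continuation for `prompt` with the debiased model."""
    # Imports are local so the sketch can be read without transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization of the checkpoint.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the newly generated text.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `generate("Опишите нейтрально:")` ("Describe neutrally:") would return the model's continuation as a string; prompts within the 32768-token context window are supported.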
