burgasdotpro/bgGPT-DeepSeek-R1-Distill-Qwen-7B is a 7.6-billion-parameter language model developed by burgasdotpro, based on the DeepSeek-R1-Distill-Qwen-7B architecture. The model is optimized for Bulgarian language processing and shows significantly lower perplexity on both short and long Bulgarian texts than its base model. It excels at tasks requiring logical reasoning and step-by-step problem solving in Bulgarian, making it suitable for applications that need robust Bulgarian language understanding and generation.