CYFRAGOVPL/PLLuM-12B-instruct is a 12-billion-parameter instruction-tuned large language model developed by a consortium of Polish scientific institutions and based on Mistral-Nemo-Base-2407. It is specialized in Polish and other Slavic and Baltic languages, refined through extensive instruction tuning and preference learning on high-quality Polish data. The model targets coherent Polish text generation and assistant tasks such as question answering and summarization, with a particular focus on Polish public administration alongside general-purpose language use.
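A minimal usage sketch follows, assuming the model is published on the Hugging Face Hub under the repository ID above and exposes the standard transformers causal-LM and chat-template interface (as Mistral-Nemo derivatives typically do); the prompt, dtype, and generation settings are illustrative only and should be checked against the official model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repository ID; verify it matches the published model.
model_id = "CYFRAGOVPL/PLLuM-12B-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 12B parameters; bf16 keeps memory use manageable
    device_map="auto",
)

# Illustrative Polish prompt: "Summarize in one sentence what a PIT-37 tax return is."
messages = [
    {"role": "user", "content": "Streść w jednym zdaniu, czym jest deklaracja PIT-37."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```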