CYFRAGOVPL/PLLuM-12B-instruct
Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Feb 7, 2025 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

CYFRAGOVPL/PLLuM-12B-instruct is a 12-billion-parameter instruction-tuned large language model developed by a consortium of Polish scientific institutions and based on Mistral-Nemo-Base-2407. It specializes in Polish and other Slavic and Baltic languages, refined through extensive instruction tuning and preference learning on high-quality Polish data. The model generates contextually coherent text and assists with tasks such as question answering and summarization, with a particular focus on Polish public administration in addition to general language tasks.
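A minimal sketch of querying the model with the Hugging Face transformers library. The model ID comes from this card; the prompt text, generation settings, and the `build_prompt` helper (with its `[INST]` layout) are illustrative assumptions, not settings recommended by the model's authors. The heavy download is gated behind an environment variable so the helper can be inspected without pulling 12B weights.

```python
import os


def build_prompt(instruction: str) -> str:
    """Wrap a single Polish instruction in a plain-text chat layout.

    Hypothetical placeholder only: the authoritative chat template ships
    with the model's tokenizer and should be preferred in practice.
    """
    return f"<s>[INST] {instruction} [/INST]"


# Set RUN_PLLUM_DEMO=1 to actually download and run the 12B model.
if os.environ.get("RUN_PLLUM_DEMO"):
    # Heavy imports kept inside the gate so the helper above stays
    # usable without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "CYFRAGOVPL/PLLuM-12B-instruct"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    # Prefer the tokenizer's own chat template over build_prompt().
    messages = [
        {"role": "user", "content": "Streść poniższy tekst w dwóch zdaniach."}
    ]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The FP8 quantization and 32k context noted above apply to the hosted deployment; loading the checkpoint locally as in this sketch uses whatever dtype is requested (bfloat16 here).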
