CYFRAGOVPL/PLLuM-12B-nc-instruct
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Published: Feb 7, 2025 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

CYFRAGOVPL/PLLuM-12B-nc-instruct is a 12-billion-parameter instruction-tuned large language model developed by a consortium of Polish scientific institutions led by Politechnika Wrocławska. Built on Mistral-Nemo-Base-2407, it is specialized for Polish and other Slavic and Baltic languages, with a 32,768-token context length. The model excels at generating contextually coherent text, question answering, and summarization in Polish, making it well suited to domain-specific applications, particularly within Polish public administration.
