lapa-llm/lapa-v0.1.2-instruct
TEXT GENERATION
Concurrency Cost: 1
Model Size: 12B
Quant: FP8
Ctx Length: 32k
Published: Oct 18, 2025
License: gemma
Vision
Architecture: Transformer

Lapa LLM v0.1.2 is a 12-billion-parameter open-source language model developed by a consortium of Ukrainian researchers and based on Gemma-3. It is optimized specifically for Ukrainian language processing, featuring a custom tokenizer that significantly reduces the token count for Ukrainian text. The model excels at tasks such as English-to-Ukrainian translation, image processing in Ukrainian, summarization, and Q&A, making it well suited for RAG systems and culturally relevant text generation.
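As a sketch of how the translation use case might be driven: most hosted inference services expose an OpenAI-compatible chat-completions API, so a request body could be assembled as below. The system prompt and `max_tokens` value are illustrative assumptions, not taken from this page; only the model ID comes from the listing.

```python
import json

def translation_request(english_text: str) -> str:
    """Build the JSON body for an English-to-Ukrainian translation request.

    Assumes an OpenAI-compatible chat-completions endpoint; the system
    prompt wording and max_tokens value are placeholders.
    """
    body = {
        "model": "lapa-llm/lapa-v0.1.2-instruct",  # model ID from this listing
        "messages": [
            {
                "role": "system",
                "content": "Translate the user's text from English to Ukrainian.",
            },
            {"role": "user", "content": english_text},
        ],
        "max_tokens": 256,  # arbitrary illustrative limit
    }
    # ensure_ascii=False keeps Cyrillic characters readable in the payload
    return json.dumps(body, ensure_ascii=False)

payload = translation_request("Good morning!")
```

The resulting string would be POSTed to the provider's `/chat/completions` endpoint with any OpenAI-compatible client.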


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model draw on the following sampler settings:

temperature: scales the output distribution; lower values make generation more deterministic
top_p: nucleus sampling; samples from the smallest token set whose cumulative probability exceeds p
top_k: restricts sampling to the k most likely tokens
frequency_penalty: penalizes tokens in proportion to how often they have already appeared
presence_penalty: penalizes tokens that have appeared at least once
repetition_penalty: multiplicative penalty applied to previously generated tokens
min_p: discards tokens whose probability falls below min_p times the top token's probability
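These settings can be passed alongside the prompt in an OpenAI-compatible request. A minimal sketch follows; the numeric values are illustrative placeholders, not the actual top-3 configurations from this page.

```python
def build_request(prompt: str, sampler: dict) -> dict:
    """Merge a user prompt with sampler settings into a chat-completions payload."""
    payload = {
        "model": "lapa-llm/lapa-v0.1.2-instruct",  # model ID from this listing
        "messages": [{"role": "user", "content": prompt}],
    }
    # Sampler keys sit at the top level of the request body
    payload.update(sampler)
    return payload

# Hypothetical values for illustration only; tune per task.
sampler = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

request = build_request("Напиши коротке привітання.", sampler)
```

Keeping the sampler settings in a separate dict makes it easy to swap between saved configurations without touching the rest of the request.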