cjvt/GaMS-27B-Instruct
Text Generation · Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: Apr 4, 2025 · License: gemma · Architecture: Transformer

cjvt/GaMS-27B-Instruct is a 27-billion-parameter instruction-tuned language model developed by researchers at the University of Ljubljana, Faculty of Computer and Information Science. Based on Google's Gemma 2 family, it has been continually pretrained on Slovene, English, Croatian, Bosnian, and Serbian corpora. The model specializes in multilingual text generation and understanding, excelling in particular at Slovene language tasks and translation, and supports a context length of 32,768 tokens. It is designed for applications requiring robust performance in these specific languages.


Popular Sampler Settings

The sampler parameters most commonly tuned by Featherless users for this model:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
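As a sketch, the parameters above map directly onto fields of an OpenAI-compatible completion request. The endpoint URL and all parameter values below are illustrative assumptions, not recommended settings for this model:

```python
# Illustrative sketch: building a chat-completion request payload that sets
# the sampler parameters listed above. Values are placeholders, not tuned
# recommendations; the endpoint URL in the comment is an assumption based
# on an OpenAI-compatible API.
import json

payload = {
    "model": "cjvt/GaMS-27B-Instruct",
    "messages": [
        {"role": "user", "content": "Prevedi v slovenščino: Good morning!"}
    ],
    "temperature": 0.7,         # sampling randomness; lower is more deterministic
    "top_p": 0.9,               # nucleus sampling: keep tokens within 90% cumulative prob.
    "top_k": 40,                # consider only the 40 most likely next tokens
    "frequency_penalty": 0.0,   # penalize tokens proportionally to how often they appeared
    "presence_penalty": 0.0,    # penalize any token that has appeared at all
    "repetition_penalty": 1.1,  # >1.0 discourages verbatim repetition
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
    "max_tokens": 256,          # cap on generated tokens (well under the 32k context)
}

body = json.dumps(payload)
# e.g. POST this body to an OpenAI-compatible endpoint such as
# https://api.featherless.ai/v1/chat/completions (assumed URL)
```

Which of these fields a given endpoint honors varies; `top_k`, `repetition_penalty`, and `min_p` are common inference-server extensions rather than part of the core OpenAI schema.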