gordicaleksa/YugoGPT
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8K · Published: Feb 22, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
YugoGPT is a 7-billion-parameter base language model developed by Aleksa Gordić, built on the Mistral 7B architecture and trained on tens of billions of tokens in Bosnian, Croatian, and Serbian (BCS). It aims to be the strongest open-source base LLM for the BCS languages, and it outperforms general-purpose models such as Mistral 7B and LLaMA 2 7B on Serbian-language evaluations. Its primary use case is as a foundation model for applications that need strong language understanding and generation across the BCS linguistic family.
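Since YugoGPT is a base (non-instruct) model, it is used by giving it text to continue. A minimal sketch of loading it through Hugging Face `transformers`, assuming the weights are published on the Hub under the `gordicaleksa/YugoGPT` ID; the sampler values in the call are illustrative placeholders, not presets from this page:

```python
# Minimal sketch: plain-text continuation with YugoGPT.
# Assumptions: weights hosted on the Hugging Face Hub as
# "gordicaleksa/YugoGPT"; sampler values below are illustrative only.

MODEL_ID = "gordicaleksa/YugoGPT"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Import lazily so this module can be inspected without
    # transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,         # illustrative, not this page's preset
        top_p=0.9,               # illustrative
        repetition_penalty=1.1,  # illustrative
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Because it is a base model, prompts work best as the opening of a passage to be continued (for example, the start of a Serbian sentence) rather than as chat-style instructions.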
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
- temperature: –
- top_p: –
- top_k: –
- frequency_penalty: –
- presence_penalty: –
- repetition_penalty: –
- min_p: –
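The parameters above control how the next token is drawn from the model's output distribution. A self-contained sketch of what temperature, top_k, and top_p (nucleus) filtering actually do to a set of logits; the default values are illustrative, since the page does not show the actual user presets:

```python
import math

def sample_filter(logits, temperature=0.7, top_k=40, top_p=0.9):
    """Apply temperature scaling, top-k, and top-p (nucleus) filtering
    to raw logits; return the renormalized probabilities of the tokens
    that survive. Default values are illustrative, not this page's presets."""
    # Temperature divides logits before softmax: <1 sharpens, >1 flattens.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # top_k: keep at most the k most probable token indices.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    ranked = ranked[:top_k]
    # top_p: keep the smallest prefix whose cumulative mass reaches top_p.
    kept, cum = [], 0.0
    for i in ranked:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the surviving tokens.
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}
```

For example, with logits `[2.0, 1.0, 0.1]` and `top_p=0.9` the third token is cut off, and the remaining two probabilities are rescaled to sum to 1. `frequency_penalty`, `presence_penalty`, and `repetition_penalty` work earlier in the pipeline, adjusting logits of already-seen tokens before this filtering step.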