slovak-nlp/mistral-sk-7b
slovak-nlp/mistral-sk-7b is a 7-billion-parameter language model, a Slovak-language adaptation of Mistral-7B-v0.1. Developed in a collaboration that includes the Technical University of Košice and the Slovak Academy of Sciences, it was fine-tuned on the Araneum Slovacum VII Maximum web corpus. It is a base pre-trained model intended for further fine-tuning on downstream tasks in the Slovak language.
Overview
mistral-sk-7b is a 7 billion parameter large language model, representing a Slovak language adaptation of the original Mistral-7B-v0.1. This model was developed through a collaborative effort involving the Department of Cybernetics and Artificial Intelligence at the Technical University of Košice, the Centre of Social and Psychological Sciences of the Slovak Academy of Sciences, and the Ľ. Štúr Institute of Linguistics, Slovak Academy of Sciences.
Key Characteristics
- Language: Primarily focused on Slovak.
- Base Model: Fine-tuned from Mistral-7B-v0.1.
- Training Data: Utilizes data from the Araneum Slovacum VII Maximum web corpus for its Slovak language adaptation.
- License: Distributed under the Apache License 2.0.
Intended Use
This model is a base pre-trained model intended for further fine-tuning. Developers can use mistral-sk-7b as a starting point for downstream natural language processing tasks that require strong performance in Slovak. Note that the model has no built-in moderation mechanisms, so outputs should be filtered or reviewed before user-facing deployment.
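As a base model on the Hugging Face Hub, it can be loaded with the standard transformers API. The sketch below is illustrative, not from the model card: it assumes `transformers` and `torch` are installed, and the prompt, decoding parameters, and helper function are the author's choices. The first call downloads the full model weights.

```python
# Minimal sketch: Slovak text continuation with mistral-sk-7b.
# Assumes `transformers` and `torch` are installed; since this is a base
# (non-instruct) model, it continues text rather than following instructions.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "slovak-nlp/mistral-sk-7b"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Continue a Slovak prompt; parameters here are illustrative defaults."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Slovak prompt: "The capital of Slovakia is"
    print(generate("Hlavné mesto Slovenska je"))
```

For task-specific use (classification, QA, instruction following), the intended path is further fine-tuning of this checkpoint rather than prompting it directly.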