nazimali/Mistral-Nemo-Kurdish is a 12-billion-parameter language model created by nazimali through continued pre-training of Mistral-Nemo-Instruct-2407. It is optimized for Kurdish language understanding via pre-training on a Kurdish Wikipedia dataset. The quantized model is intended for further fine-tuning on specific tasks that can leverage its enhanced Kurdish language capabilities.
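As a minimal sketch of how the model could be loaded for evaluation or as a starting point for fine-tuning, the snippet below uses the standard transformers causal-LM classes; it assumes the repository id "nazimali/Mistral-Nemo-Kurdish" is available on the Hugging Face Hub and that half-precision weights fit your hardware.

```python
# Hedged sketch: load the continued-pretrained model and run a short generation.
# Assumptions: Hub repo id "nazimali/Mistral-Nemo-Kurdish", bfloat16 weights on GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nazimali/Mistral-Nemo-Kurdish"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # adjust dtype/device_map to your hardware
    device_map="auto",
)

# Example Kurdish prompt; the model is a base (not chat-tuned) checkpoint,
# so plain text continuation is the expected usage before fine-tuning.
prompt = "Kurdistan"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For task-specific use, this checkpoint would typically be wrapped in a fine-tuning loop (for example with PEFT/LoRA) rather than used directly for chat.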