trajkovnikola/MKLLM-7B-Instruct
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4K
Published: Jun 16, 2024
License: cc-by-nc-sa-4.0
Architecture: Transformer
Concurrency cost: 1

trajkovnikola/MKLLM-7B-Instruct is a 7 billion parameter instruction-tuned language model developed by trajkovnikola, built on Mistral-7B-v0.1. It is adapted to the Macedonian language through continued pretraining on a combined Macedonian and English text corpus. On Macedonian-language benchmarks it outperforms larger models such as Llama3-8B-Instruct and Mistral-7B-Instruct-v0.3.
