domestic-yak-8B-instruct: A Macedonian Instruction-Tuned LLM
LVSTCK/domestic-yak-8B-instruct is an 8-billion-parameter language model developed by LVSTCK and fine-tuned for instruction following in Macedonian. It builds on the domestic-yak-8B base and was trained for three epochs on the sft-mk dataset, roughly 100k samples spanning categories such as QA, chat, reasoning, essays, and code.
Key Capabilities
- Macedonian Language Proficiency: Optimized for generating coherent and task-specific responses in Macedonian.
- Instruction Following: Enhanced capabilities for understanding and executing user instructions, making it suitable for interactive applications.
- Competitive Performance: On Macedonian-specific benchmarks it performs on par with, and in some cases surpasses, much larger models such as Llama 70B, making it a leading 8B-parameter model for the language.
Good for
- Developing chatbots and virtual assistants for Macedonian speakers.
- Applications requiring task-specific responses and instruction adherence in Macedonian.
- Research and development in low-resource language NLP, particularly for Macedonian.
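Example Usage
The card itself does not include a code snippet; the sketch below shows one plausible way to load and prompt the model with the Hugging Face transformers library, assuming the repository ships a standard chat template. The Macedonian prompt and all generation settings are illustrative, not taken from the model card.

```python
# Minimal sketch (not from the model card): loading and prompting the model with
# Hugging Face transformers, assuming it follows a standard chat-template setup.
MODEL_ID = "LVSTCK/domestic-yak-8B-instruct"

if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # illustrative; adjust to your hardware
        device_map="auto",
    )

    # Macedonian prompt: "Tell me something about Skopje."
    messages = [{"role": "user", "content": "Кажи ми нешто за Скопје."}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The `if __name__ == "__main__":` guard keeps the heavy model download out of import time, so the snippet can be dropped into a larger script without side effects.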