LVSTCK/domestic-yak-8B-instruct
Text Generation | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 32k | Published: Jan 14, 2025 | License: llama3.1 | Architecture: Transformer | Status: Cold
LVSTCK/domestic-yak-8B-instruct is an 8-billion-parameter instruction-tuned language model developed by LVSTCK and optimized specifically for Macedonian. It excels at instruction-following tasks, making it well suited to chatbots and virtual assistants. On several Macedonian benchmarks it performs on par with much larger models such as Llama 70B, and it is currently the best-performing model in its 8B parameter class for the language.
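As a chat-oriented model, it can be driven with the standard Hugging Face `transformers` text-generation pipeline. The sketch below is illustrative, not an official usage snippet: the system prompt and generation settings are assumptions, and running it requires `transformers`, `torch`, and enough memory for 8B weights.

```python
# Hedged sketch: one chat turn with LVSTCK/domestic-yak-8B-instruct via the
# Hugging Face `transformers` pipeline API (assumed environment; the model
# itself must be downloaded on first use).

MODEL_ID = "LVSTCK/domestic-yak-8B-instruct"

# Llama-3.1-style chat messages; the system prompt here is purely illustrative.
messages = [
    {"role": "system",
     "content": "You are a helpful assistant that replies in Macedonian."},
    {"role": "user",
     "content": "Како се вели 'hello' на македонски?"},
]

def generate(max_new_tokens: int = 256) -> str:
    """Run a single chat turn; needs `transformers`, `torch`, and ~16 GB of memory."""
    from transformers import pipeline  # deferred import: heavy optional dependency
    pipe = pipeline("text-generation", model=MODEL_ID, device_map="auto")
    out = pipe(messages, max_new_tokens=max_new_tokens)
    # The pipeline returns the full conversation; the last entry is the reply.
    return out[0]["generated_text"][-1]["content"]

if __name__ == "__main__":
    print(generate())
```

Deferring the `transformers` import keeps the module importable on machines without the library installed; on constrained hardware, the FP8 quantization noted above is what makes the 32k-context 8B model practical to serve.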