indischepartij/MiaLatte-Indo-Mistral-7b
MiaLatte-Indo-Mistral-7b by indischepartij is a 7 billion parameter language model derived from OpenMia, specifically designed to answer everyday questions in Bahasa Indonesia. This model is a merge of OpenMia-Indo-Mistral-7b-v2, Kesehatan-7B-v0.1, and WestSeverus-7B-DPO-v2, utilizing a dare_ties merge method. It excels in Indonesian language understanding and generation, making it suitable for applications requiring localized conversational AI. The model achieves an average score of 67.86 on the Open LLM Benchmark, with a context length of 4096 tokens.
MiaLatte-Indo-Mistral-7b: Indonesian Language Model
MiaLatte-Indo-Mistral-7b is a 7 billion parameter language model developed by indischepartij, specifically optimized for the Indonesian language. It is a derivative of the OpenMia model, enhanced through a merge of three distinct models using MergeKit's dare_ties method.
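A dare_ties merge of three models like the one described above is typically expressed as a MergeKit configuration file. The sketch below is illustrative only: the choice of base model, and the density and weight values, are placeholders and assumptions, not settings published by the author.

```yaml
# Hypothetical MergeKit config for a dare_ties merge of the three source models.
# base_model, density, and weight values are placeholders, not the author's settings.
merge_method: dare_ties
base_model: indischepartij/OpenMia-Indo-Mistral-7b-v2  # assumed base model
models:
  - model: indischepartij/OpenMia-Indo-Mistral-7b-v2
  - model: Obrolin/Kesehatan-7B-v0.1
    parameters:
      density: 0.5   # fraction of delta parameters kept (placeholder)
      weight: 0.5    # contribution to the merged weights (placeholder)
  - model: FelixChao/WestSeverus-7B-DPO-v2
    parameters:
      density: 0.5
      weight: 0.5
dtype: bfloat16
```

In dare_ties, each non-base model contributes a sparsified delta from the base model; the density controls how much of that delta survives, and the weight scales its contribution.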
Key Capabilities
- Indonesian Language Proficiency: Designed to answer everyday questions specifically in Bahasa Indonesia.
- Merged Architecture: Combines indischepartij/OpenMia-Indo-Mistral-7b-v2, Obrolin/Kesehatan-7B-v0.1, and FelixChao/WestSeverus-7B-DPO-v2 to leverage their respective strengths.
- Performance: Achieves an average score of 67.86 on the Open LLM Benchmark, including 66.55 on the AI2 Reasoning Challenge and 63.93 on MMLU.
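As a Mistral-7B derivative, the model can be queried with Hugging Face transformers. The sketch below assumes the model follows Mistral's [INST] chat format; check the tokenizer's chat template on the model page before relying on it.

```python
# Minimal sketch for asking MiaLatte-Indo-Mistral-7b an everyday question in
# Bahasa Indonesia. Assumes Mistral-style [INST] prompt tags.

def build_prompt(question: str) -> str:
    """Wrap an Indonesian question in Mistral-style instruction tags."""
    return f"<s>[INST] {question} [/INST]"

def ask(question: str, max_new_tokens: int = 256) -> str:
    # Heavy imports kept inside the function so the sketch can be read
    # without downloading the 7B model weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "indischepartij/MiaLatte-Indo-Mistral-7b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    # Keep prompt plus generation under the model's 4096-token context window.
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

# Example (requires downloading the weights):
#   print(ask("Apa ibu kota Indonesia?"))
```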
Good For
- Applications requiring robust natural language understanding and generation in Bahasa Indonesia.
- Building chatbots or conversational agents for Indonesian-speaking users.
- Tasks involving general knowledge and everyday queries in an Indonesian context.