indischepartij/MiaLatte-Indo-Mistral-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 2, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights

MiaLatte-Indo-Mistral-7b by indischepartij is a 7-billion-parameter language model derived from OpenMia, designed to answer everyday questions in Bahasa Indonesia. It is a merge of OpenMia-Indo-Mistral-7b-v2, Kesehatan-7B-v0.1, and WestSeverus-7B-DPO-v2, built with the dare_ties merge method. The model focuses on Indonesian language understanding and generation, making it suitable for applications that need localized conversational AI. It achieves an average score of 67.86 on the Open LLM Benchmark and supports a context length of 4096 tokens.
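A dare_ties merge like the one described above is typically expressed as a mergekit YAML recipe. The sketch below illustrates the general shape, assuming the three source models listed in the description; the repository org prefixes, base model, and the density/weight values are illustrative assumptions, not the published recipe.

```yaml
# Hypothetical mergekit recipe for a dare_ties merge of the three
# models named in the card; densities/weights are placeholders.
models:
  - model: indischepartij/OpenMia-Indo-Mistral-7b-v2
    parameters:
      density: 0.5   # fraction of delta weights kept (DARE dropout)
      weight: 0.4    # assumed relative contribution to the merge
  - model: Kesehatan-7B-v0.1
    parameters:
      density: 0.5
      weight: 0.3
  - model: WestSeverus-7B-DPO-v2
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1  # assumed common Mistral base
dtype: bfloat16
```

In dare_ties, each fine-tuned model's delta from the base is randomly sparsified (controlled by `density`), rescaled, and then sign-consensus merged as in TIES, which reduces parameter interference between the source models.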
