sarahlintang/mistral-indo-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Oct 17, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights · Warm
The sarahlintang/mistral-indo-7b model is a 7 billion parameter language model fine-tuned from Mistral 7B v0.1. Developed by sarahlintang, this model specializes in understanding and generating text based on Indonesian instructions. It leverages an 8192-token context length and is optimized for tasks requiring Indonesian language proficiency. Its primary use case is instruction-following in Indonesian.
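As a minimal sketch of instruction-following use, the snippet below builds an Indonesian instruction prompt. The Alpaca-style template is an assumption; the model card above does not specify the exact format used during fine-tuning, so check the upstream repository before relying on it.

```python
# Sketch: formatting an Indonesian instruction for this model.
# The "### Instruction / ### Response" template is an ASSUMPTION
# (Alpaca-style), not confirmed by the model card.
def build_prompt(instruction: str) -> str:
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

# Example: "Explain what artificial intelligence is." in Indonesian.
prompt = build_prompt("Jelaskan apa itu kecerdasan buatan.")
print(prompt)
```

The resulting string would then be sent to the model as the completion prompt, leaving room within the 8192-token context for the generated response.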
Popular Sampler Settings
The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings:
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
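A hedged sketch of how these settings might be passed to an OpenAI-compatible completions endpoint is shown below. The numeric values are illustrative placeholders, not the actual user configurations (which the page lists per tab); `repetition_penalty` and `min_p` are extensions beyond the standard OpenAI parameters and are assumed to be accepted by the serving API.

```python
# Sketch: attaching sampler settings to a completions request payload.
# All numeric values are ILLUSTRATIVE, not the real top-3 configs.
sampler_settings = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,  # non-standard OpenAI param; assumed supported
    "min_p": 0.05,              # non-standard OpenAI param; assumed supported
}

payload = {
    "model": "sarahlintang/mistral-indo-7b",
    "prompt": "Jelaskan apa itu pembelajaran mesin.",
    "max_tokens": 256,
    **sampler_settings,
}
```

This payload could then be POSTed to the provider's `/v1/completions` route with any standard HTTP client.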