sarahlintang/mistral-indo-7b

Hugging Face
TEXT GENERATION

  • Concurrency Cost: 1
  • Model Size: 7B
  • Quant: FP8
  • Context Length: 8k
  • Published: Oct 17, 2023
  • License: apache-2.0
  • Architecture: Transformer
  • Open Weights · Warm

The sarahlintang/mistral-indo-7b model is a 7 billion parameter language model fine-tuned from Mistral 7B v0.1. Developed by sarahlintang, it specializes in understanding and generating text based on Indonesian instructions. It supports an 8192-token context window, and its primary use case is instruction following in Indonesian.


Model Overview

sarahlintang/mistral-indo-7b is a 7 billion parameter language model built upon the Mistral 7B v0.1 architecture. This model has been specifically fine-tuned using an Indonesian instruction dataset, making it proficient in processing and generating text in the Indonesian language. It supports a context length of 8192 tokens.
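Since the checkpoint is hosted on Hugging Face, it can presumably be loaded with the standard `transformers` API. The sketch below is a minimal, hedged example; it assumes the `transformers` and `torch` packages are installed and that enough memory is available for a 7B model, and the Indonesian prompt is only an illustration.

```python
# Minimal sketch: loading sarahlintang/mistral-indo-7b with Hugging Face
# transformers. Assumes `transformers` and `torch` are installed and the
# machine can hold a 7B-parameter model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarahlintang/mistral-indo-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread layers across available devices
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Tokenize an Indonesian instruction and generate a completion.
prompt = "Jelaskan apa itu kecerdasan buatan."  # "Explain what artificial intelligence is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern works for any causal LM on the Hub; only `model_id` and the prompt language change.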

Key Capabilities

  • Indonesian Instruction Following: Excels at understanding and responding to instructions provided in Indonesian.
  • Text Generation: Capable of generating coherent and contextually relevant text in Indonesian.
  • Mistral 7B Foundation: Benefits from the robust base capabilities of the Mistral 7B v0.1 model.

Good For

  • Applications requiring Indonesian language processing.
  • Developing chatbots or virtual assistants that interact in Indonesian.
  • Tasks involving instruction-based text generation in Indonesian.

Popular Sampler Settings

These are the three parameter combinations most used by Featherless users for this model. Each configuration covers the following samplers: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
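To show how these seven samplers fit together in practice, here is a hypothetical request body for an OpenAI-compatible completions endpoint. The numeric values are placeholders chosen for illustration, not the actual Featherless community presets, and the prompt is an invented example.

```python
# Hypothetical sampler configuration; the values are illustrative
# placeholders, not the presets reported by Featherless users.
sampler_settings = {
    "temperature": 0.7,         # randomness of token sampling
    "top_p": 0.9,               # nucleus sampling probability cutoff
    "top_k": 40,                # sample only from the 40 most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they appeared
    "presence_penalty": 0.0,    # penalize tokens that appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
}

# Example request body for an OpenAI-compatible completions endpoint.
payload = {
    "model": "sarahlintang/mistral-indo-7b",
    "prompt": "Tuliskan pantun tentang teknologi.",  # "Write a pantun about technology."
    "max_tokens": 128,
    **sampler_settings,
}
```

Lower `temperature`/`top_p` values make output more deterministic; `repetition_penalty` above 1.0 and a small `min_p` are common defaults for chat-style generation.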