cldersaienril/Instameta-Mistral-v0.1-7b
Text Generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 8k
Published: Oct 24, 2023
License: apache-2.0
Architecture: Transformer
Weights: open

cldersaienril/Instameta-Mistral-v0.1-7b is a 7-billion-parameter language model based on the Mistral architecture, with an 8192-token context length. It is fine-tuned on the Dolphin dataset (an open-source implementation of Microsoft's Orca) together with a private dataset of Chinese GPT-4/GPT-3.5 dialogues. The model is designed to strengthen multilingual capabilities, particularly in Chinese, and is suited to general-purpose conversational AI tasks.
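For local experimentation, here is a minimal sketch of loading and querying the model with Hugging Face transformers. It assumes the weights are published under the same repo id on the Hugging Face Hub; the prompt is illustrative only.

```python
# Minimal sketch: load and query the model with Hugging Face transformers.
# Assumes the weights are available on the Hub under the same repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cldersaienril/Instameta-Mistral-v0.1-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Chinese prompt to exercise the multilingual fine-tune:
# "Hello, please briefly introduce yourself."
prompt = "你好，请简单介绍一下你自己。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```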


Popular Sampler Settings

The three most popular sampler configurations among Featherless users for this model each combine the parameters listed below (a hedged usage sketch follows the list).

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
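As a rough sketch of applying such a sampler configuration, the snippet below sends a chat completion through the OpenAI Python SDK. The base URL, the pass-through of non-standard samplers via extra_body, and all parameter values are assumptions for illustration, not one of the popular configs above.

```python
# Hedged sketch: set the sampler parameters above on a chat completion
# via the OpenAI Python SDK. The base_url and the extra_body pass-through
# for non-standard samplers are assumptions about the provider's API,
# and the values are placeholders, not a recommended configuration.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="cldersaienril/Instameta-Mistral-v0.1-7b",
    messages=[{"role": "user", "content": "用中文简要介绍一下Orca方法。"}],
    # Standard OpenAI sampler parameters:
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Samplers outside the OpenAI schema, passed through if supported:
    extra_body={"top_k": 40, "min_p": 0.05, "repetition_penalty": 1.1},
)
print(response.choices[0].message.content)
```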