CorticalStack/mistral-7b-dolphin-sft
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Feb 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

CorticalStack/mistral-7b-dolphin-sft is a 7-billion-parameter language model fine-tuned from unsloth/mistral-7b-bnb-4bit on the cognitivecomputations/dolphin dataset. Supervised fine-tuning (SFT) on this diverse conversational dataset specializes the model in instruction following. It is optimized for efficient deployment and inference, making it suitable for applications that require responsive, accurate text generation.


Popular Sampler Settings

The three most popular sampler configurations among Featherless users for this model combine the following parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
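As a sketch of how these sampler parameters might be used, the snippet below builds a request payload for an OpenAI-compatible chat completions endpoint. The endpoint URL, the API key placeholder, and the specific sampler values are illustrative assumptions, not the actual configurations reported on this page:

```python
import json

# Illustrative sampler values -- assumptions, not the real
# Featherless user configurations for this model.
sampler_settings = {
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Chat-completions payload naming this model; sampler settings are
# passed as top-level request fields.
payload = {
    "model": "CorticalStack/mistral-7b-dolphin-sft",
    "messages": [
        {"role": "user", "content": "Summarize Mistral 7B in two sentences."}
    ],
    "max_tokens": 256,
    **sampler_settings,
}

# To actually send the request (needs the `requests` package and a key):
# import requests
# resp = requests.post(
#     "https://api.featherless.ai/v1/chat/completions",  # assumed endpoint
#     headers={"Authorization": "Bearer <YOUR_API_KEY>"},
#     json=payload,
# )
# print(resp.json()["choices"][0]["message"]["content"])

print(json.dumps(payload, indent=2))
```

With an 8k context window, long conversations may need truncation on the client side before being placed in `messages`.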