mncai/Mistral-7B-v0.1-alpaca-2k
Text Generation

- Concurrency Cost: 1
- Model Size: 8B
- Quant: FP8
- Ctx Length: 8k
- License: MIT
- Architecture: Transformer
- Open Weights, Cold

mncai/Mistral-7B-v0.1-alpaca-2k is a 7-billion-parameter language model developed by Minds And Company, fine-tuned from Mistral-7B-v0.1. The model is instruction-tuned on the KoAlpaca-v1.1av dataset and uses the Llama prompt template. It is designed for general language generation tasks, leveraging its Mistral backbone for efficient performance.
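Since the card states the model expects the Llama prompt template, prompts should be wrapped in `[INST] ... [/INST]` markers before generation. A minimal sketch of such a formatter is below; the exact template string (including the optional `<<SYS>>` system block) is an assumption based on the standard Llama chat format, so verify it against the model card before use.

```python
from typing import Optional


def build_prompt(instruction: str, system: Optional[str] = None) -> str:
    """Wrap an instruction in the Llama-style [INST] ... [/INST] template.

    NOTE: this template is an assumption based on the common Llama chat
    format; the model card should be checked for the exact expected layout.
    """
    if system:
        # Optional system message goes inside a <<SYS>> block.
        return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"
    return f"<s>[INST] {instruction} [/INST]"


# Example: format an instruction for generation.
prompt = build_prompt("Summarize the history of Hangul in two sentences.")
```

The resulting string can then be passed to a text-generation pipeline (e.g. `transformers.pipeline("text-generation", model="mncai/Mistral-7B-v0.1-alpaca-2k")`).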
