mncai/Mistral-7B-v0.1-orca-2k
Text generation · Open weights
Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 8k · License: MIT · Architecture: Transformer

mncai/Mistral-7B-v0.1-orca-2k is a 7-billion-parameter language model developed by Minds And Company, fine-tuned from Mistral-7B-v0.1. The model is instruction-tuned on the OpenOrca-KO dataset using a Llama-style prompt template. It is designed for general language generation tasks, with a focus on responses aligned with the Orca methodology.
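Since the card says the model was tuned with a Llama prompt template, a prompt-formatting helper can make that concrete. This is a minimal sketch assuming the standard Llama-2 chat delimiters (`[INST]`, `[/INST]`, `<<SYS>>`); the card does not specify the exact template, so these delimiters are an assumption.

```python
def build_llama_prompt(user_message: str, system_message: str = "") -> str:
    """Wrap a user message in Llama-style [INST] ... [/INST] tags.

    The delimiters follow the common Llama-2 chat convention; the model
    card only says "Llama Prompt Template", so this exact format is an
    assumption, not a confirmed specification.
    """
    if system_message:
        return (
            f"[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
            f"{user_message} [/INST]"
        )
    return f"[INST] {user_message} [/INST]"


prompt = build_llama_prompt(
    "Summarize the Orca training methodology in one sentence.",
    system_message="You are a helpful assistant.",
)
print(prompt)
```

The resulting string would be passed as-is to the model's text-generation endpoint or tokenizer; if the actual template differs, only this helper needs to change.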
