mncai/Mistral-7B-v0.1-orca-1k
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · License: MIT · Architecture: Transformer · Open Weights · Cold

The mncai/Mistral-7B-v0.1-orca-1k model, developed by Minds And Company, is a fine-tuned variant of the Mistral-7B-v0.1 backbone. It was trained on the kyujinpy/OpenOrca-KO dataset using the Llama prompt template, and is intended for general language generation tasks, drawing on the Mistral architecture for efficient inference.
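The card says the model was fine-tuned with the Llama prompt template. A minimal sketch of building such a prompt, assuming the common Llama `[INST]` instruction format (the card does not show the exact template used in training):

```python
def format_llama_prompt(instruction: str, system: str = "") -> str:
    """Build a Llama-style [INST] prompt string.

    NOTE: this assumes the standard Llama chat format; the model card
    does not specify the exact template used during fine-tuning.
    """
    if system:
        # Optional system message wrapped in <<SYS>> markers.
        return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"
    return f"<s>[INST] {instruction} [/INST]"


# Example: a Korean instruction, matching the OpenOrca-KO training data.
prompt = format_llama_prompt("다음 문장을 영어로 번역하세요: 안녕하세요.")
```

The resulting string would then be passed to the model (e.g. via a Hugging Face `transformers` text-generation pipeline) as the raw input text.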
