mncai/Mistral-7B-v0.1-orca_platy-2k
Text generation | Concurrency cost: 1 | Model size: 8B | Quantization: FP8 | Context length: 8k | License: MIT | Architecture: Transformer | Open weights | Cold

The mncai/Mistral-7B-v0.1-orca_platy-2k model is a fine-tuned variant of the Mistral-7B-v0.1 backbone, developed by Minds And Company. It was fine-tuned on a combination of the kyujinpy/KOpen-platypus and kyujinpy/OpenOrca-KO datasets using the Llama prompt template, and is designed for general language understanding and generation tasks, building on the capabilities of its Mistral base.
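Since the card mentions the Llama prompt template, inference prompts should be wrapped accordingly. A minimal sketch of such a formatter is below, assuming the Llama-2 `[INST] ... [/INST]` convention with an optional `<<SYS>>` block; the exact template used during fine-tuning is an assumption here, not confirmed by the card.

```python
from typing import Optional


def build_prompt(instruction: str, system: Optional[str] = None) -> str:
    """Wrap a user instruction in Llama-2-style [INST] tags.

    If a system message is given, it is embedded in a <<SYS>> block,
    following the common Llama-2 chat convention (an assumption here).
    """
    if system:
        return (
            f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"
        )
    return f"<s>[INST] {instruction} [/INST]"


# Example: a prompt that could be sent to the model's text-generation endpoint.
prompt = build_prompt("Summarize the Mistral-7B architecture in one sentence.")
print(prompt)
```

The formatted string can then be passed to any standard text-generation client; the model's completion follows the closing `[/INST]` tag.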
