mncai/Mistral-7B-v0.1-orca_platy-1k
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · License: MIT · Architecture: Transformer · Open Weights · Cold

The mncai/Mistral-7B-v0.1-orca_platy-1k model, developed by Minds And Company, is a fine-tuned variant of the Mistral-7B-v0.1 backbone. It was trained on the kyujinpy/KOpen-platypus and kyujinpy/OpenOrca-KO datasets and uses the Llama prompt template for instruction following. The model targets general language tasks, building on its Mistral foundation with instruction tuning.
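Since the card states the model uses the Llama prompt template, a minimal sketch of formatting an instruction in that style is shown below. The exact template string is an assumption based on the common Llama-2 chat format; verify it against the model's tokenizer configuration before use.

```python
from typing import Optional


def build_llama_prompt(instruction: str, system: Optional[str] = None) -> str:
    """Format a user instruction in the Llama-style template.

    NOTE: this template is assumed from the common Llama-2 chat format;
    it may differ from what the model was actually fine-tuned with.
    """
    if system:
        # Optional system message wrapped in <<SYS>> markers.
        return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"
    return f"<s>[INST] {instruction} [/INST]"


# Example: a plain instruction with no system message.
prompt = build_llama_prompt("Summarize the plot of Hamlet in two sentences.")

# The resulting string would typically be passed to a text-generation
# pipeline, e.g. (not run here, as it downloads model weights):
#   from transformers import pipeline
#   pipe = pipeline("text-generation", model="mncai/Mistral-7B-v0.1-orca_platy-1k")
#   pipe(prompt)
```

The helper `build_llama_prompt` is a hypothetical name introduced for illustration; it is not part of the model's released code.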
