mncai/Mistral-7B-v0.1-platy-2k
Text Generation | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 8k | License: MIT | Architecture: Transformer | Open Weights

mncai/Mistral-7B-v0.1-platy-2k is a 7-billion-parameter language model developed by Minds And Company, fine-tuned from Mistral-7B-v0.1 on the kyujinpy/KOpen-platypus dataset using the Llama prompt template. Building on the Mistral architecture, it is optimized for tasks that require strong instruction following and reasoning.
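A minimal usage sketch follows. The exact prompt format is not documented here; the Alpaca-style `### Instruction:` / `### Response:` layout below is an assumption commonly seen with Platypus fine-tunes, and the Hugging Face `transformers` calls are shown as an illustrative (not verified) inference path.

```python
# Sketch: prompt construction and (hypothetical) inference for
# mncai/Mistral-7B-v0.1-platy-2k. The template below is an ASSUMPTION;
# confirm it against the model card before relying on it.

def build_prompt(instruction: str) -> str:
    """Format an instruction in the assumed Alpaca-style template."""
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

if __name__ == "__main__":
    # Hypothetical inference with Hugging Face transformers (requires
    # downloading the ~7B-parameter weights, so it is left commented out):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # tok = AutoTokenizer.from_pretrained("mncai/Mistral-7B-v0.1-platy-2k")
    # model = AutoModelForCausalLM.from_pretrained("mncai/Mistral-7B-v0.1-platy-2k")
    # ids = tok(build_prompt("Summarize Mistral-7B."), return_tensors="pt")
    # print(tok.decode(model.generate(**ids, max_new_tokens=128)[0]))
    print(build_prompt("Summarize Mistral-7B."))
```

Keeping the template in one helper makes it easy to swap in the correct format once verified, without touching the inference code.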
