mncai/Mistral-7B-v0.1-platy-1k
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 8k · License: MIT · Architecture: Transformer · Open weights · Cold

mncai/Mistral-7B-v0.1-platy-1k is a 7-billion-parameter language model developed by Minds And Company, fine-tuned from Mistral-7B-v0.1. The model is specialized for instruction following and was trained on datasets such as kyujinpy/KOpen-platypus. It uses the Llama prompt template, making it suitable for applications that require precise conversational responses and close adherence to instructions.
