mncai/Mistral-7B-CollectiveCognition-OpenOrca-1k
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Oct 20, 2023 · License: MIT · Architecture: Transformer · Open Weights · Cold

mncai/Mistral-7B-CollectiveCognition-OpenOrca-1k is a 7-billion-parameter language model developed by Minds And Company, built on the Mistral-7B-v0.1 backbone. It is fine-tuned on the CollectiveCognition/chats-data-2023-09-27 dataset and uses the Llama prompt template. The model is intended for general conversational AI tasks, leveraging the Mistral architecture for efficient inference.
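Since the card states that the model uses the Llama prompt template, a minimal sketch of single-turn prompt formatting may help; this assumes the common Llama-2 chat convention with `[INST]`/`[/INST]` markers and an optional `<<SYS>>` system block, which is an assumption not confirmed by the card itself:

```python
def build_prompt(system_message: str, user_message: str) -> str:
    """Format a single-turn prompt in the Llama-2 chat style.

    Assumed layout (not confirmed by the model card):
    [INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]
    """
    return (
        f"[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize the Mistral architecture in one sentence.",
)
print(prompt)
```

The resulting string would then be passed as the raw input to the model's text-generation endpoint; check the tokenizer's chat template for the exact markers before relying on this layout.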
