mncai/Mistral-7B-CollectiveCognition
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 8k · Published: Oct 20, 2023 · License: MIT · Architecture: Transformer · Open weights

mncai/Mistral-7B-CollectiveCognition is a 7-billion-parameter language model from Minds And Company, built on the Mistral-7B-v0.1 backbone and fine-tuned on the CollectiveCognition/chats-data-2023-09-27 dataset. It uses the Llama prompt template and supports an 8192-token context length, making it well suited to chat-based, conversational applications.
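A minimal sketch of how the model might be loaded and prompted with the Hugging Face `transformers` library. The model ID and 8192-token context length come from the card above; the exact `[INST]`-style prompt format is an assumption based on the card's "Llama Prompt Template" note, so verify it against the model's own documentation before relying on it.

```python
MODEL_ID = "mncai/Mistral-7B-CollectiveCognition"  # model ID from the card
MAX_CTX = 8192  # context length stated on the card


def format_llama_prompt(user_message: str, system_message: str = "") -> str:
    """Wrap a user turn in a Llama-style [INST] template (assumed format)."""
    if system_message:
        return (
            f"<s>[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
            f"{user_message} [/INST]"
        )
    return f"<s>[INST] {user_message} [/INST]"


def generate_reply(user_message: str) -> str:
    """Load the model and generate one chat reply (downloads ~14 GB of weights)."""
    # Heavy dependency imported here so the prompt helper stays importable.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = format_llama_prompt(user_message)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Calling `generate_reply("Hello!")` would run one chat turn; in practice you would also truncate the conversation history so the tokenized prompt stays under `MAX_CTX`.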
