teknium/CollectiveCognition-v1-Mistral-7B
Text generation
Model size: 7B
Quantization: FP8
Context length: 4K
Published: Oct 4, 2023
License: apache-2.0
Architecture: Transformer
Concurrency cost: 1
Open weights

Collective Cognition v1 is a 7-billion-parameter Mistral-based language model developed by teknium, fine-tuned with QLoRA on only 100 high-quality GPT-4 chat examples. Despite this small training dataset and lightweight training setup, the model performs strongly on the TruthfulQA benchmark, competing with 70B-scale models. It is tuned toward truthful, accurate responses, making it a candidate for applications that require high factual integrity.
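A minimal usage sketch with the Hugging Face `transformers` library follows. The model id matches this card; the prompt format and generation settings are illustrative assumptions, not recommendations from the model author.

```python
# Sketch: loading teknium/CollectiveCognition-v1-Mistral-7B with
# Hugging Face transformers. The model id is from this card; the
# USER/ASSISTANT prompt style and sampling settings are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "teknium/CollectiveCognition-v1-Mistral-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers across available GPUs/CPU
)

# Simple single-turn prompt (format is an assumption).
prompt = "USER: What causes the seasons on Earth?\nASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 7B model in FP16 needs roughly 14 GB of memory; quantized loading (e.g. via `bitsandbytes`) can reduce this if your hardware is constrained.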
