liminerity/Mistral-quiet-star-demo
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 23, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1

liminerity/Mistral-quiet-star-demo is a 7-billion-parameter language model developed by liminerity and fine-tuned from unsloth/mistral-7b-bnb-4bit. Trained on an Alpaca-based dataset, it is designed to improve reasoning by encouraging the model to "think before speaking." The author presents it as a demonstration of potential for low-cost AGI systems, improving reasoning without specialized architectures, which makes it suited to tasks that require structured thought.
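As a sketch of how such a model might be queried, the snippet below builds an Alpaca-style prompt (the exact template this fine-tune expects is an assumption, inferred from the mention of an Alpaca-based dataset) and shows, in comments, how it could be passed to the model via the Hugging Face transformers API. The generation call is left commented out because it requires downloading the 7B weights.

```python
# Hypothetical usage sketch. The Alpaca prompt template below is an
# assumption; check the model card for the template the fine-tune expects.

ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Format a user instruction in the Alpaca style."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Explain why the sky is blue, reasoning step by step.")
print(prompt)

# To actually generate text (downloads the 7B weights on first run):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("liminerity/Mistral-quiet-star-demo")
# model = AutoModelForCausalLM.from_pretrained("liminerity/Mistral-quiet-star-demo")
# ids = tok(prompt, return_tensors="pt").input_ids
# out = model.generate(ids, max_new_tokens=256)
# print(tok.decode(out[0], skip_special_tokens=True))
```

The prompt-building part runs on its own; only the commented generation section needs the model weights and a transformers installation.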
