uukuguy/speechless-thoughts-mistral-7b-v1.0
Text generation · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Feb 15, 2024 · License: llama2 · Architecture: Transformer · Open weights

uukuguy/speechless-thoughts-mistral-7b-v1.0 is a 7-billion-parameter Mistral-based causal language model fine-tuned by uukuguy, with a context length of 8192 tokens. It is optimized for reasoning, planning, and coding tasks, trained on a diverse mix of datasets including Airoboros, OpenOrca, and several code-centric corpora. It serves as the baseline for the larger speechless-sparsetral-16x7b-MoE model and shows strong performance on benchmarks such as MMLU and HellaSwag.
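A minimal usage sketch for loading the model with the Hugging Face transformers library. The model id comes from this card; the prompt template and generation settings are illustrative assumptions, so check the upstream model card for the canonical instruction format before relying on them.

```python
MODEL_ID = "uukuguy/speechless-thoughts-mistral-7b-v1.0"


def build_prompt(instruction: str) -> str:
    # Assumed Alpaca-style instruction template; the exact format the model
    # expects is an assumption -- verify against the upstream model card.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    # transformers is imported lazily so the sketch stays importable
    # without the library (and the ~14 GB weight download) present.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Prompts should fit within the 8192-token context window; inputs longer than that need truncation or chunking before generation.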
