uukuguy/speechless-thoughts-mistral-7b
Text generation · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Feb 13, 2024 · License: llama2 · Architecture: Transformer
uukuguy/speechless-thoughts-mistral-7b is a 7-billion-parameter Mistral-based causal language model fine-tuned by uukuguy. It serves as the dense baseline for the speechless-sparsetral-16x7b-MoE model and targets coding, reasoning, and planning tasks. The model was trained on a diverse dataset that includes filtered categories from jondurbin/airoboros-2.2, Open-Orca, Open-Platypus, WizardLM, and Python-specific datasets, making it well suited to tasks that require strong logical and programming capabilities.
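As a sketch of how the model might be used locally, the snippet below loads it with the Hugging Face transformers library. The model id comes from this card; the prompt, generation settings, and dtype/device choices are assumptions, not documented defaults.

```python
# Minimal usage sketch for uukuguy/speechless-thoughts-mistral-7b.
# Generation parameters below are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "uukuguy/speechless-thoughts-mistral-7b"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for `prompt` with greedy-ish defaults."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",   # place weights on available GPU(s)/CPU
        torch_dtype="auto",  # use the checkpoint's native dtype
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, return only the newly generated text.
    new_tokens = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

Note that loading the full 7B checkpoint requires a machine with enough GPU or CPU memory; quantized variants (such as the FP8 build listed above) reduce that footprint.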