ajibawa-2023/General-Stories-Mistral-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · License: apache-2.0 · Architecture: Transformer · Open Weights

ajibawa-2023/General-Stories-Mistral-7B is a 7-billion-parameter language model fine-tuned by ajibawa-2023 from Mistral-7B-v0.1. It was trained for over 15 days on the General-Stories-Collection, a dataset of 1.3 million stories curated for general audiences. The model is geared toward generating engaging narratives and versatile storytelling across broad themes.
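For local use, a minimal inference sketch with Hugging Face transformers is shown below. It assumes the weights are published on the Hub under "ajibawa-2023/General-Stories-Mistral-7B" and that enough GPU memory (or an appropriate quantized load) is available; the prompt and sampling values are illustrative only.

```python
# Minimal local-inference sketch (assumptions noted in the lead-in above).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ajibawa-2023/General-Stories-Mistral-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a short story about a lighthouse keeper who befriends a whale."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling keeps the narration varied; values here are placeholders.
output = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```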


Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model are built from the sampler fields listed below; a hedged request sketch follows the list.

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
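The sketch below shows one way to pass such sampler settings through an OpenAI-compatible chat completions call. The base URL, the API-key environment variable name, the use of extra_body to forward fields the OpenAI schema does not define (top_k, min_p, repetition_penalty), and all numeric values are assumptions, not the actual popular configurations; check the Featherless documentation before relying on them.

```python
# Hedged sketch: sampler settings via an OpenAI-compatible endpoint.
# Endpoint URL, env-var name, extra_body handling, and values are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",       # assumed endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],      # hypothetical variable name
)

response = client.chat.completions.create(
    model="ajibawa-2023/General-Stories-Mistral-7B",
    messages=[{"role": "user", "content": "Tell a short story about a stray comet."}],
    # Standard OpenAI sampler fields (placeholder values):
    temperature=0.8,
    top_p=0.95,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Non-standard fields forwarded to the backend, if the server accepts them:
    extra_body={"top_k": 40, "min_p": 0.05, "repetition_penalty": 1.1},
)
print(response.choices[0].message.content)
```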