KoboldAI/Mistral-7B-Holodeck-1

Hugging Face
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8K · Published: Jan 9, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Warm

KoboldAI/Mistral-7B-Holodeck-1 is a 7 billion parameter language model fine-tuned from Mistral's 7B architecture. The model is optimized for creative writing and narrative generation, having been trained on a diverse dataset of approximately 3000 ebooks. It features a context length of 8192 tokens and excels at generating text within specified genres, thanks to the genre tags prepended to its training data.


KoboldAI/Mistral-7B-Holodeck-1 Overview

KoboldAI/Mistral-7B-Holodeck-1 is a 7 billion parameter language model built upon the Mistral 7B architecture. This model has undergone a specialized fine-tuning process, distinguishing it from other general-purpose LLMs.

Key Capabilities

  • Genre-Specific Text Generation: The model's training involved a dataset of around 3000 ebooks across various genres. A unique aspect of its training data is the prepending of [Genre: <genre1>, <genre2>] tags, which likely enables more controlled and genre-consistent output.
  • Narrative and Creative Writing: Its extensive training on diverse ebook content positions it as a strong candidate for tasks requiring creative storytelling, character development, and immersive narrative generation.
  • 8192 Token Context Window: Offers a substantial context length, allowing for the generation and understanding of longer passages of text, crucial for complex narratives.
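Because the training data prepended `[Genre: <genre1>, <genre2>]` tags, prompts in the same format should steer the model's output. The helper below is a hypothetical convenience for building such prompts (the function name and newline separator are assumptions; only the tag format comes from the model card):

```python
def build_prompt(text: str, genres: list[str]) -> str:
    """Prepend the [Genre: ...] tag used in Holodeck's training data.

    The tag format follows the model card; this helper itself is a
    hypothetical convenience, not part of the model's tooling.
    """
    tag = "[Genre: " + ", ".join(genres) + "]"
    return f"{tag}\n{text}"

prompt = build_prompt("The airlock hissed open.", ["science fiction", "horror"])
# prompt begins with the tag line "[Genre: science fiction, horror]"
```

The resulting string can then be passed to any completion endpoint or local inference pipeline serving the model.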

Good For

  • Creative Writing Assistance: Ideal for authors, role-players, and content creators looking for AI assistance in generating stories, dialogues, or descriptive passages.
  • Genre Exploration: Users can potentially leverage the model's genre-tagged training to guide its output towards specific literary styles or themes.

Limitations

As with many NLP technologies, this model may exhibit biases related to gender, profession, race, and religion, stemming from its training data.

Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model tune the following samplers:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
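The actual user configurations are not reproduced here, but a request to an OpenAI-compatible completions endpoint would carry these samplers as fields alongside the prompt. The values below are purely illustrative placeholders, not the settings Featherless users actually favor:

```json
{
  "model": "KoboldAI/Mistral-7B-Holodeck-1",
  "prompt": "[Genre: fantasy] The dragon circled the tower once more.",
  "max_tokens": 256,
  "temperature": 0.8,
  "top_p": 0.9,
  "top_k": 40,
  "min_p": 0.05,
  "repetition_penalty": 1.1,
  "frequency_penalty": 0.0,
  "presence_penalty": 0.0
}
```

For creative writing, a moderately high temperature with a repetition penalty above 1.0 is a common starting point; the model's genre tag goes at the front of the prompt itself.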