sophosympatheia/Midnight-Miqu-70B-v1.0

Text generation · Concurrency cost: 4 · Model size: 69B · Quant: FP8 · Context length: 32k · Published: Feb 29, 2024 · License: other · Architecture: Transformer

Midnight-Miqu-70B-v1.0 is a 69-billion-parameter SLERP merge by sophosympatheia, combining 152334H/miqu-1-70b-sf and sophosympatheia/Midnight-Rose-70B-v2.0.3. Designed for roleplaying and storytelling, it retains Midnight Rose's creative strengths while gaining Miqu's long-context capabilities, supporting up to 32K tokens. This uncensored model is optimized for creative text generation tasks.


Midnight-Miqu-70B-v1.0: A Specialized Merge for Creative Text

Midnight-Miqu-70B-v1.0 is a 69 billion parameter language model created by sophosympatheia through a SLERP merge of two distinct models: 152334H/miqu-1-70b-sf and sophosympatheia/Midnight-Rose-70B-v2.0.3. This merge aims to combine the creative strengths of Midnight Rose with the enhanced long-context capabilities derived from Miqu.
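A SLERP (spherical linear interpolation) merge blends each pair of corresponding weight tensors along the arc between them rather than averaging them linearly, which tends to preserve tensor magnitudes better than a straight weighted sum. The following is a minimal sketch of the interpolation step only, not the full merge pipeline used for this model; the linear-interpolation fallback for near-parallel tensors is a common convention, assumed here for illustration:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between the two (normalized) directions.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    # Angle between the two tensors, treated as flat vectors.
    dot = np.clip(np.sum(v0_n * v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In an actual merge this function would be applied per tensor (often with a different `t` per layer or per weight type), then the results written back into a single checkpoint.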

Key Capabilities

  • Optimized for Roleplaying and Storytelling: The model is specifically designed and tested for these creative text generation tasks, aiming to provide engaging and coherent narratives.
  • Extended Context Window: It supports a context length of up to 32,768 tokens, with limited testing showing coherence even at 64,000 tokens using alpha_rope scaling.
  • Uncensored Output: This model is uncensored, offering flexibility for various creative applications, though users are responsible for its output.
  • Customizable Sampler and Prompting: The README provides detailed tips and JSON configurations for sampler settings (e.g., Quadratic Sampling, Min-P) and prompting strategies (e.g., specific context and system prompts for SillyTavern) to fine-tune its behavior.
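Of the sampler settings mentioned above, Min-P is the simplest to illustrate: it keeps only tokens whose probability is at least a fixed fraction of the most likely token's probability, so the candidate pool shrinks when the model is confident and widens when it is not. A minimal sketch, assuming raw logits as a NumPy array (the `min_p` value here is illustrative, not a recommendation from the README):

```python
import numpy as np

def min_p_filter(logits, min_p=0.05):
    """Zero out tokens below min_p * max-probability, then renormalize."""
    # Softmax with max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Threshold scales with the top token's probability.
    threshold = min_p * probs.max()
    filtered = np.where(probs >= threshold, probs, 0.0)
    return filtered / filtered.sum()
```

A real sampler would then draw a token from the filtered distribution; frontends like SillyTavern expose `min_p` as one knob among several, which is why the README ships full JSON presets rather than a single value.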

Good For

  • Creative Writing: Ideal for generating detailed stories, character dialogues, and immersive roleplay scenarios.
  • Long-form Content Generation: Its extended context window makes it suitable for maintaining coherence over lengthy narratives.
  • Experimental Use: Developers interested in exploring model merges and fine-tuning creative outputs will find its detailed configuration and prompting tips useful.

Important Note on Licensing: This model is based on miqu-1-70b-sf, which itself is derived from a leaked version of a Mistral model. Consequently, Midnight-Miqu-70B-v1.0 is only suitable for personal use, and commercial or public deployment is strongly discouraged due to potential legal risks.