sophosympatheia/Midnight-Miqu-70B-v1.5

Text Generation · Concurrency Cost: 4 · Model Size: 69B · Quant: FP8 · Ctx Length: 32K · Published: Mar 11, 2024 · License: other · Architecture: Transformer

Midnight-Miqu-70B-v1.5 by sophosympatheia is a 69-billion-parameter DARE Linear merge with a 32K-token context length, built on a base derived from a leaked Mistral model. It is designed and optimized specifically for uncensored roleplaying and storytelling tasks. This version improves on Midnight Miqu v1.0 by incorporating elements from Tess-70B-v1.6, gaining ground on certain tests without sacrificing writing quality.


Model Overview

sophosympatheia/Midnight-Miqu-70B-v1.5 is a 69 billion parameter language model, a DARE Linear merge of sophosympatheia/Midnight-Miqu-70B-v1.0 and migtissera/Tess-70B-v1.6. Built on a base derived from a leaked Mistral model, it offers a 32K token context length. This iteration maintains the core performance and "feel" of v1.0 while showing improvements in certain internal tests.
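For readers unfamiliar with the merge method, the following is a minimal, illustrative sketch of the DARE Linear idea: each donor model's delta from the base is randomly dropped at some rate, the surviving entries are rescaled by 1/(1 − drop rate) to preserve their expected contribution, and the rescaled deltas are combined with linear weights on top of the base. The weights, drop rate, and toy tensors below are placeholders, not the actual recipe used for this merge (the real merge is configured through mergekit).

```python
# Illustrative sketch of DARE (Drop And REscale) + linear merging of task deltas.
# Weights and drop rate are placeholders, not the recipe used for Midnight-Miqu v1.5.
import torch

def dare_delta(finetuned: torch.Tensor, base: torch.Tensor, drop_rate: float) -> torch.Tensor:
    """Drop a fraction of the delta's entries at random, rescale the survivors."""
    delta = finetuned - base
    mask = torch.rand_like(delta) >= drop_rate      # keep entries with prob (1 - drop_rate)
    return delta * mask / (1.0 - drop_rate)         # rescale to preserve the expectation

def dare_linear_merge(base: torch.Tensor,
                      donors: list[tuple[torch.Tensor, float]],
                      drop_rate: float = 0.3) -> torch.Tensor:
    """Add a weighted sum of DARE-processed deltas back onto the base tensor."""
    merged = base.clone()
    for finetuned, weight in donors:
        merged += weight * dare_delta(finetuned, base, drop_rate)
    return merged

# Toy usage on a single tensor; a real merge applies this per parameter tensor.
base = torch.randn(4, 4)
donor_a = base + 0.1 * torch.randn(4, 4)
donor_b = base + 0.1 * torch.randn(4, 4)
merged = dare_linear_merge(base, donors=[(donor_a, 0.5), (donor_b, 0.5)])
```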

Key Capabilities & Features

  • Optimized for Creative Tasks: Specifically designed for uncensored roleplaying and storytelling, with noted strong performance in these areas.
  • Uncensored Output: The model is uncensored, allowing for a wide range of creative expression (users are responsible for their usage).
  • Long Context Support: Capable of handling up to 32K tokens with alpha_rope set to 1 (see the loading sketch after this list).
  • Sampler & Prompting Guidance: The README provides detailed recommendations for sampler settings (e.g., Quadratic Sampling, Min-P) and prompting strategies, including specific system prompts and context templates for platforms like SillyTavern, to maximize creative output.
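To make the context and sampler guidance concrete, here is a minimal sketch, assuming a local GGUF quantization and the llama-cpp-python backend. The file name, Min-P value, and other sampler numbers are illustrative placeholders rather than the README's exact recommendations, and Quadratic Sampling (smoothing factor) support varies by backend, so it is omitted here.

```python
# Minimal sketch: loading a (hypothetical) GGUF quant of Midnight-Miqu-70B-v1.5
# with a 32K context window and Min-P-style sampling via llama-cpp-python.
# File name and sampler values are placeholders, not official recommendations.
from llama_cpp import Llama

llm = Llama(
    model_path="midnight-miqu-70b-v1.5.Q4_K_M.gguf",  # hypothetical local quant file
    n_ctx=32768,      # the model supports up to 32K tokens of context
    n_gpu_layers=-1,  # offload all layers to GPU if VRAM allows
    # alpha_rope = 1 in the README corresponds to leaving RoPE scaling at its default,
    # i.e. no extra rope_freq_base / rope_freq_scale adjustment is applied here.
)

out = llm(
    "Write the opening paragraph of a gothic mystery set in an abandoned lighthouse.",
    max_tokens=400,
    temperature=1.0,     # placeholder; tune alongside Min-P
    min_p=0.05,          # Min-P sampling (available in recent llama-cpp-python versions)
    top_p=1.0,           # disable top-p so Min-P does the filtering
    repeat_penalty=1.05,
)
print(out["choices"][0]["text"])
```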

Usage Considerations

  • Personal Use Only: Due to its lineage from a leaked Mistral model, this merge is explicitly stated as suitable only for personal use.
  • "Warming Up": The model may require initial "warming up" with few-shot prompting and descriptive system messages to achieve desired writing quality at the start of a new chat.
  • Instruct Formats: Recommends Vicuna and Mistral instruct formats, with Vicuna being the preferred option.
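Since Vicuna is the preferred format, a minimal sketch of a Vicuna-style prompt builder follows. The "USER:/ASSISTANT:" layout is the common Vicuna convention, and the system text and few-shot turn are hypothetical placeholders, so consult the model's README for its exact templates; the few-shot turn also illustrates the "warming up" tip above.

```python
# Minimal sketch of a Vicuna-style prompt builder (the preferred instruct format).
# System text, separators, and the few-shot turn are illustrative placeholders;
# the model's README ships its own recommended templates for SillyTavern and others.
def build_vicuna_prompt(system: str, turns: list[tuple[str, str]], user_msg: str) -> str:
    """Assemble a Vicuna-style prompt: system text, then USER:/ASSISTANT: turns."""
    parts = [system.strip()]
    for user, assistant in turns:          # prior few-shot / chat-history turns
        parts.append(f"USER: {user.strip()}")
        parts.append(f"ASSISTANT: {assistant.strip()}")
    parts.append(f"USER: {user_msg.strip()}")
    parts.append("ASSISTANT:")             # left open for the model to complete
    return "\n".join(parts)

prompt = build_vicuna_prompt(
    system="You are a skilled storyteller who writes immersive, descriptive scenes.",
    turns=[("Describe the harbor at dusk.",
            "The harbor glowed amber as the last ferries slid home...")],  # warm-up example
    user_msg="Continue the scene as a storm rolls in.",
)
```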