sophosympatheia/Aurora-Nights-70B-v1.0

Text Generation · Concurrency Cost: 4 · Model Size: 69B · Quant: FP8 · Ctx Length: 32k · Published: Dec 23, 2023 · License: llama2 · Architecture: Transformer · Open Weights

sophosympatheia/Aurora-Nights-70B-v1.0 is a 69 billion parameter language model, a blend of Tulu-2-DPO-70B, Xwin-LM-70B-V0.1, and Opus-v0.5-70b, further merged with fiction.live-Kimiko-V2-70B. This model excels at instruction following and generating creative, uncensored storytelling and roleplaying content. It is specifically designed for immersive narrative experiences and character interaction, offering a 32768 token context length.


Model Overview

sophosympatheia/Aurora-Nights-70B-v1.0 is a 69 billion parameter language model created by sophosympatheia. It is a blend of several base models, including allenai/tulu-2-dpo-70b, Xwin-LM/Xwin-LM-70B-V0.1, and dreamgen/opus-v0.5-70b, with an additional merge of nRuaif/fiction.live-Kimiko-V2-70B. The merge aims to combine the instruction-following strength of its aligned components with the creative-writing strength of its storytelling components.
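To make the idea of a model merge concrete, here is a minimal sketch of a linear weight merge, the simplest merging technique. This is illustrative only: the actual Aurora-Nights recipe uses more elaborate merge methods and per-layer weighting, and the function name and structure below are assumptions, not the author's code.

```python
def merge_linear(state_dicts, weights):
    """Linearly average each parameter across several models.

    state_dicts: list of dicts mapping parameter name -> tensor/value,
                 all with identical keys and shapes.
    weights:     per-model mixing coefficients that sum to 1.0.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "mixing weights must sum to 1"
    merged = {}
    for name in state_dicts[0]:
        # Weighted sum of the same parameter taken from every model.
        merged[name] = sum(w * sd[name] for w, sd in zip(weights, state_dicts))
    return merged

# Toy usage with scalar "parameters": an even 50/50 blend of two models.
blend = merge_linear([{"w": 1.0, "b": 0.0}, {"w": 3.0, "b": 2.0}], [0.5, 0.5])
```

Real merges operate on full checkpoint tensors (e.g. via tooling such as mergekit) rather than scalars, but the arithmetic per parameter is the same.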

Key Capabilities

  • Instruction Following: Demonstrates strong ability to adhere to given instructions.
  • Creative Storytelling: Excels at generating imaginative and coherent narratives.
  • Roleplaying: Highly capable in producing engaging and uncensored roleplaying content.
  • Uncensored Output: Designed to generate content without inherent censorship, making it suitable for diverse creative applications.

Performance & Usage

The model achieves an average score of 73.77 on the Open LLM Leaderboard, with notable results of 70.47 on MMLU (5-shot) and 88.33 on HellaSwag (10-shot). While primarily optimized for roleplaying and storytelling, it is expected to perform well on other general language tasks. Users are advised to experiment with sampler settings such as Quadratic Sampling (smoothing factor 0.2–0.5) and Min-P (0.05–0.9) for optimal output, and to customize the system prompt for the desired behavior. The recommended maximum context for coherence is around 6,144 tokens, though the model supports up to 32,768 tokens.
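The Min-P setting mentioned above can be illustrated with a small standalone sketch: Min-P keeps only tokens whose probability is at least `min_p` times the probability of the most likely token, then renormalizes. This is a generic illustration of the sampling technique, not code from the model card; in practice you would set the equivalent parameter in your inference backend.

```python
import math

def min_p_filter(logits, min_p=0.05):
    """Apply Min-P filtering to a list of logits.

    Tokens whose probability falls below min_p * max(probs) are removed;
    the surviving probabilities are renormalized to sum to 1.
    """
    # Numerically stable softmax over the raw logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Threshold scales with the top token's probability, so the filter
    # is aggressive when the model is confident and permissive when it is not.
    threshold = min_p * max(probs)
    kept = [p if p >= threshold else 0.0 for p in probs]

    # Renormalize the surviving mass.
    s = sum(kept)
    return [p / s for p in kept]

# A high min_p prunes all but the dominant token; a low one keeps more options.
strict = min_p_filter([2.0, 1.0, -5.0], min_p=0.5)
loose = min_p_filter([2.0, 1.0, -5.0], min_p=0.05)
```

Lower `min_p` values (e.g. 0.05) preserve more of the distribution and suit creative generation, which is why the recommended range for this model starts there.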

Licensing

This model inherits the licenses of its base components, including the Llama 2 license and AllenAI's AI2 ImpACT License for Tulu. Users should consult legal counsel regarding the intersection of these licenses for any use beyond private applications.