Gryphe/MythoLogic-Mini-7b

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jul 28, 2023 · License: other · Architecture: Transformer

Gryphe/MythoLogic-Mini-7b is a 7-billion-parameter language model built on a Llama-2 core, augmented with Nous Hermes-2, Stable Beluga, and a Kimiko LoRA. It is optimized for roleplaying, making it well suited to interactive narrative generation and character-driven applications. It features a 4096-token context length and uses a gradient merge strategy to improve linguistic intricacy.


MythoLogic-Mini-7b Overview

MythoLogic-Mini-7b is a 7-billion-parameter language model developed by Gryphe as a smaller counterpart to the Mytho series. It is built on a Llama-2 core that leverages the strengths of Nous Hermes-2, further enhanced by Stable Beluga and a carefully distilled Kimiko LoRA.
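The gradient merge described above can be pictured as a per-layer interpolation between model weights. The sketch below is illustrative only, not Gryphe's actual recipe: the `hermes_weight` schedule, layer count, and taper shape are assumptions loosely based on the card's "90% initially" and "12th layer onwards" description.

```python
# Illustrative per-layer gradient merge schedule (NOT the actual recipe):
# the Hermes-2 mixing weight starts at ~0.9 and tapers off from layer 12
# onward, letting the Stable Beluga / Kimiko blend dominate later layers.

def hermes_weight(layer: int, n_layers: int = 32,
                  start: float = 0.9, taper_from: int = 12) -> float:
    """Return a hypothetical mixing weight for the Hermes-2 tensor at a
    given transformer layer; the remainder goes to the other models."""
    if layer < taper_from:
        return start
    # Linear taper from `start` down to 0 across the remaining layers.
    span = n_layers - taper_from
    frac = (layer - taper_from + 1) / span
    return max(0.0, start * (1.0 - frac))

def merge_layer(hermes_tensor, other_tensor, layer, n_layers=32):
    """Blend two weight tensors for one layer: w * A + (1 - w) * B."""
    w = hermes_weight(layer, n_layers)
    return [w * a + (1.0 - w) * b for a, b in zip(hermes_tensor, other_tensor)]

# Early layers are dominated by Hermes-2; the final layer is fully the blend.
print(hermes_weight(0))   # 0.9
print(hermes_weight(31))  # 0.0
```

The linear taper is just one plausible schedule; gradient merges in practice can use any monotone curve over the layer index.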

Key Capabilities

  • Optimized for Roleplaying: Unlike many general-purpose 7B models, MythoLogic-Mini-7b places a strong emphasis on improving roleplaying aspects through its unique gradient merge strategy.
  • Hybrid Architecture: It combines a Hermes-2 core (weighted at roughly 90% in the early layers) with Stable Beluga and the Kimiko LoRA, which take over the intricate linguistic details from the 12th layer onward.
  • Alpaca Prompt Format: Best performance is achieved using the Alpaca instruction format, with specific recommendations for roleplay prompts.
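For reference, a minimal Alpaca-style prompt can be assembled as below. This is the generic Alpaca template; the exact roleplay wording Gryphe recommends may differ, so treat the instruction text as a placeholder.

```python
# Generic Alpaca instruction template (illustrative; Gryphe's recommended
# roleplay prompt may use different instruction wording).
def alpaca_prompt(instruction: str, user_input: str = "") -> str:
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
    )
    if user_input:  # optional "### Input:" section for extra context
        prompt += f"### Input:\n{user_input}\n\n"
    return prompt + "### Response:\n"

print(alpaca_prompt("Write Alice's next reply in this roleplay."))
```

The model's completion is generated after the trailing `### Response:` header.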

Good For

  • Interactive Roleplay: Excels in generating character-driven responses and maintaining narrative consistency in roleplaying scenarios.
  • Creative Writing: Suitable for tasks requiring nuanced linguistic expression and character interaction.
  • Resource-Constrained Environments: As a 7B model, it offers a balance of capability and efficiency for deployment where larger models might be impractical.
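A rough back-of-the-envelope estimate makes the efficiency point concrete: at FP8 (1 byte per parameter), the weights of a 7B model fit in well under 8 GB. The figures below ignore the KV cache and activation overhead, so treat them as a lower bound, not a measured footprint.

```python
# Rough VRAM estimate for the weights of a 7B model at FP8 quantization.
# Assumption: 1 byte per parameter; KV cache and activations are ignored.
params = 7e9
bytes_per_param = 1  # FP8
weights_gb = params * bytes_per_param / 1024**3
print(f"{weights_gb:.1f} GB")  # ~6.5 GB for weights alone
```

In practice, serving with the full 4k context adds KV-cache memory on top of this, but the weight footprint alone shows why a 7B FP8 model fits on a single consumer GPU.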