KaraKaraWitch/Llama-MagicalGirl

Public · 70B parameters · FP8 · 32,768-token context

Model Overview

Llama-MagicalGirl is a 70-billion-parameter language model by KaraKaraWitch, constructed by merging multiple pre-trained Llama-based models. The merge uses the SCE (Select, Calculate, Erase) method with KaraKaraWitch/Llama-3.X-Workout-70B as its base, and incorporates SicariusSicariiStuff/Negative_LLAMA_70B, TheDrummer/Nautilus-70B-v0.1, Steelskull/L3.3-Nevoria-R1-70b, and Tarek07/Inception-LLaMa-70B.
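A merge like the one described above is typically expressed as a mergekit configuration. The sketch below is a plausible reconstruction, not the author's actual config: the `select_topk` value and `dtype` are assumptions, and the real recipe may differ.

```yaml
# Hypothetical mergekit config for an SCE merge of the listed models.
# select_topk and dtype are guesses; the published recipe may differ.
merge_method: sce
base_model: KaraKaraWitch/Llama-3.X-Workout-70B
models:
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
  - model: TheDrummer/Nautilus-70B-v0.1
  - model: Steelskull/L3.3-Nevoria-R1-70b
  - model: Tarek07/Inception-LLaMa-70B
parameters:
  select_topk: 0.1
dtype: bfloat16
```

With a config file like this, `mergekit-yaml config.yml ./output-dir` would produce the merged checkpoint.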

Key Characteristics

  • Merge Method: Uses the SCE merge method to combine the strengths of several donor models.
  • Base Model: Built on KaraKaraWitch/Llama-3.X-Workout-70B.
  • Creative Focus: Intended for creative text generation rather than general assistant use.
  • Sampling Recommendations: Best results are reported with Temperature 1.4 and Min P 0.03, which boost creativity while keeping verbosity in check.
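To make the recommended settings concrete, the sketch below implements min-p filtering in plain Python: after temperature scaling and softmax, any token whose probability falls below `min_p` times the top token's probability is discarded and the survivors are renormalized. This illustrates what Temperature 1.4 / Min P 0.03 do; it is not the model's own sampling code.

```python
import math

def min_p_filter(logits, temperature=1.4, min_p=0.03):
    """Temperature-scale logits, softmax, then apply min-p filtering.

    Illustrative sketch of the recommended sampler settings: tokens with
    probability below min_p * max_prob are dropped, the rest renormalized.
    """
    scaled = [l / temperature for l in logits]
    top = max(scaled)
    exps = [math.exp(s - top) for s in scaled]  # shift for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    cutoff = min_p * max(probs)
    kept = [(i, p) for i, p in enumerate(probs) if p >= cutoff]
    norm = sum(p for _, p in kept)
    return {i: p / norm for i, p in kept}

# A sharp distribution keeps only the strong candidates; sampling then
# proceeds over the returned (renormalized) token probabilities.
dist = min_p_filter([5.0, 4.0, 0.0, -3.0])
```

Because the cutoff scales with the top token's probability, min-p prunes aggressively when the model is confident but leaves many options open when the distribution is flat, which is why it pairs well with a high temperature like 1.4.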

Important Considerations

  • System Prompt Requirement: The model can produce offensive or dark content if used without a system prompt. Users are strongly advised to implement a simple system prompt to guide its behavior.
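In practice, "implement a simple system prompt" just means ensuring every conversation starts with a system message. The helper below is a minimal sketch; the prompt wording is an illustrative assumption, since the card does not prescribe specific text.

```python
def with_system_prompt(messages, system_prompt):
    """Prepend a system message unless the conversation already has one.

    The messages follow the common chat format of {"role", "content"}
    dicts; the system prompt text itself is up to the user.
    """
    if messages and messages[0].get("role") == "system":
        return messages
    return [{"role": "system", "content": system_prompt}] + messages

# Example with a placeholder prompt (wording is an assumption):
chat = with_system_prompt(
    [{"role": "user", "content": "Write a short scene at a lighthouse."}],
    "You are a creative writing assistant. Keep content appropriate.",
)
```

The resulting list can be passed to a chat template (e.g. a tokenizer's `apply_chat_template`) before generation.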

This model aims to provide a unique generative experience, particularly for users seeking specific creative outputs, provided appropriate safety measures like system prompts are in place.