AuriAetherwiing/MN-12B-Starcannon-v2

Parameters: 12B
Quantization: FP8
Context length: 32768
License: cc-by-nc-nd-4.0

MN-12B-Starcannon-v2 Overview

MN-12B-Starcannon-v2 is a 12 billion parameter language model by AuriAetherwiing, created by merging existing pre-trained models with the mergekit tool. The merge uses the TIES method, with nothingiisreal/MN-12B-Celeste-V1.9 as the base model and intervitens/mini-magnum-12b-v1.1 merged into it.
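A mergekit TIES merge of this kind is driven by a small YAML configuration. The sketch below shows the general shape of such a config; the density and weight values are illustrative assumptions, not the author's published recipe:

```yaml
# Illustrative mergekit TIES config (parameter values are assumptions,
# not the recipe actually used for MN-12B-Starcannon-v2)
models:
  - model: intervitens/mini-magnum-12b-v1.1
    parameters:
      density: 0.5   # fraction of delta parameters kept after trimming (assumed)
      weight: 0.5    # relative contribution of this model (assumed)
base_model: nothingiisreal/MN-12B-Celeste-V1.9
merge_method: ties
dtype: bfloat16
```

With mergekit installed, a config like this would typically be run via `mergekit-yaml config.yml ./output-model`.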

Key Characteristics

  • Creative Writing Style: The model is noted for its creative output and verbose prose, offering a distinct narrative voice.
  • Blended Personalities: It combines elements from both Celeste v1.9 and mini-magnum-12b-v1.1, aiming for a balance that provides more variety than Magnum and more detailed writing than Celeste v1.9.
  • Merge Method: Built using the TIES (TrIm, Elect Sign & Merge) merge method, which trims low-magnitude parameter changes and resolves sign conflicts so that multiple fine-tuned models can be combined with less interference.

Use Cases

This model is particularly suited for applications requiring:

  • Creative Text Generation: Ideal for scenarios where unique and imaginative text is desired.
  • Narrative Development: Its verbose and varied writing style can be beneficial for generating stories, roleplay scenarios, or descriptive content.

Quantized versions are available, including dynamic FP8, static GGUF, and EXL2 formats.