knifeayumu/Cydonia-v4-MS3.2-Magnum-Diamond-24B

Parameters: 24B
Quantization: FP8
Context length: 32768 tokens
Released: Jul 23, 2025
License: apache-2.0
Source: Hugging Face
Overview

Cydonia-v4-MS3.2-Magnum-Diamond-24B is a 24-billion-parameter language model developed by knifeayumu. It is a merged model, combining the strengths of two pre-trained language models: TheDrummer/Cydonia-24B-v4 and Doctor-Shotgun/MS3.2-24B-Magnum-Diamond. The merge was performed using SLERP (spherical linear interpolation), which blends each pair of corresponding weight tensors along an arc between them rather than a straight line, preserving the overall scale of the weights while mixing the two models.
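
For intuition, here is a minimal sketch of the SLERP interpolation itself. This is not knifeayumu's actual merge pipeline (merges like this are typically run with a tool such as mergekit over every tensor in both checkpoints); the toy tensor names and the interpolation factor t=0.5 are illustrative assumptions.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    great-circle arc between the two (flattened) weight vectors.
    """
    a, b = v0.ravel(), v1.ravel()
    # Cosine of the angle between the two weight vectors.
    dot = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)
    dot = np.clip(dot, -1.0, 1.0)
    theta = np.arccos(dot)
    # Nearly colinear weights: fall back to plain linear interpolation.
    if theta < eps:
        return (1.0 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Toy example: blend two stand-in "weight" tensors at the midpoint.
w_cydonia = np.random.randn(4, 4)
w_magnum = np.random.randn(4, 4)
merged = slerp(0.5, w_cydonia, w_magnum)
```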

Key Characteristics

  • Parameter Count: 24 billion.
  • Merge Method: Utilizes the SLERP merge method for combining model weights.
  • Base Model: TheDrummer/Cydonia-24B-v4 served as the base model for the merge.
  • Refinement Focus: The primary motivation for this merge was to mitigate the "horny and verbose" tendencies observed in the Doctor-Shotgun/MS3.2-24B-Magnum-Diamond component, aiming for more balanced and controlled output.

Intended Use Cases

This model is suited to general language generation tasks where output more restrained and less verbose than that of its uninhibited constituent models is desired. Its more tempered response profile makes it potentially useful for applications requiring neutral or controlled text generation.
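
As a usage sketch, the model can presumably be loaded like any other causal language model through the standard Hugging Face transformers API; the prompt and generation settings below are illustrative assumptions, not values recommended by the author.

```python
# Hypothetical usage sketch: loading the merge with Hugging Face transformers.
# Generation settings are illustrative, not author-recommended values.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "knifeayumu/Cydonia-v4-MS3.2-Magnum-Diamond-24B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

prompt = "Write a short scene set in a rain-soaked city."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, temperature=0.8, do_sample=True)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```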