ChaoticNeutrals/Captain-Eris_Violet_Toxic-Magnum-12B

Public · 12B parameters · FP8 · 32768-token context · Hosted on Hugging Face

Model Overview

ChaoticNeutrals/Captain-Eris_Violet_Toxic-Magnum-12B is a 12-billion-parameter language model published by ChaoticNeutrals. It was produced with mergekit using the SLERP (spherical linear interpolation) merge method, which blends the weights of two distinct base models.
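A mergekit SLERP merge of this kind is typically driven by a YAML config. The sketch below is a hypothetical reconstruction based only on the details stated in this card (SLERP method, per-layer-type weighting for self_attn and mlp, a global factor of 0.420, bfloat16); the model names, layer ranges, and per-layer value schedules are placeholders, not the actual recipe:

```yaml
# Hypothetical mergekit config sketch -- model names and schedules are placeholders.
slices:
  - sources:
      - model: parent-model-a        # placeholder for the first parent model
        layer_range: [0, 40]
      - model: parent-model-b        # placeholder for the second parent model
        layer_range: [0, 40]
merge_method: slerp
base_model: parent-model-a
parameters:
  t:
    - filter: self_attn              # attention layers get their own schedule
      value: [0, 0.5, 0.3, 0.7, 1]   # illustrative values only
    - filter: mlp                    # MLP layers get a different schedule
      value: [1, 0.5, 0.7, 0.3, 0]   # illustrative values only
    - value: 0.420                   # global interpolation factor from the card
dtype: bfloat16                      # output precision stated in the card
```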

Key Merge Details

  • Base Models: the merge blends the weights of two parent 12B models (not enumerated in this card).
  • Merge Method: SLERP, which is known for producing coherent blends of model weights.
  • Configuration: interpolation weights vary by layer type (self_attn and mlp), with a global interpolation factor of 0.420 and bfloat16 output precision.
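To make the merge method concrete, here is a minimal sketch of what spherical linear interpolation does to a pair of weight vectors. This is an illustrative pure-Python implementation of the SLERP formula, not mergekit's actual code:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at factor t."""
    # Angle between the two vectors, via their normalised dot product.
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    theta = math.acos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

Unlike plain averaging, SLERP follows the arc between the two weight vectors, preserving their magnitudes' geometry, which is why it tends to produce more coherent blends. With t = 0.420, the result sits slightly closer to the first parent.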

Intended Use Cases

This merged model is suited to general-purpose text generation and understanding tasks, drawing on the combined strengths of its parent models. Its 12B parameters and 32768-token context window make it a capable option for applications that require processing long inputs.