mergekit-community/Irix-12B_Slush_V2

Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Jun 14, 2025 · Architecture: Transformer

Irix-12B_Slush_V2 is a 12-billion-parameter language model created by mergekit-community, formed by merging DreadPoor/Irix-12B-Model_Stock and mergekit-community/Slush-Lyra-Gutenberg-Bophades with the SLERP method. The merge combines the strengths of its constituent models into a versatile base for a range of natural language processing tasks, and its 32,768-token context length suits applications that require extensive contextual understanding.


Overview

Irix-12B_Slush_V2 is a 12-billion-parameter language model developed by mergekit-community. It was created with the SLERP (Spherical Linear Interpolation) merge method, which combines two base models into a single checkpoint that draws on the capabilities of both.
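For intuition, SLERP blends corresponding weight tensors along an arc on the hypersphere rather than along a straight line, which tends to keep the merged weights at a scale similar to the parents. The following is a minimal sketch of the interpolation itself (an illustration of the general technique, not mergekit's internal code; the function and variable names are hypothetical):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherically interpolate between two flattened weight vectors a and b."""
    a_n = a / (np.linalg.norm(a) + eps)          # normalize to unit length
    b_n = b / (np.linalg.norm(b) + eps)
    omega = np.arccos(np.clip(np.dot(a_n, b_n), -1.0, 1.0))  # angle between them
    if omega < eps:                               # nearly colinear: plain lerp
        return (1.0 - t) * a + t * b
    so = np.sin(omega)
    # Weights trace the arc between a and b; t = 0.5 gives an equal blend.
    return (np.sin((1.0 - t) * omega) / so) * a + (np.sin(t * omega) / so) * b

# Example: blend two toy "weight tensors" with equal weighting (t = 0.5).
w_a = np.random.randn(16).astype(np.float32)
w_b = np.random.randn(16).astype(np.float32)
merged = slerp(0.5, w_a, w_b)
```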

Merge Details

This model is a merge of:

  • DreadPoor/Irix-12B-Model_Stock
  • mergekit-community/Slush-Lyra-Gutenberg-Bophades

The merge used a t parameter array of [0.5, 0.5, 0.5, 0.5, 0.5], an equal 0.5 weighting for both parents across the layer-wise interpolation gradient. The base model for the merge was mergekit-community/Slush-Lyra-Gutenberg-Bophades, and the merge was performed in bfloat16.
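These details map onto a standard mergekit SLERP configuration. Below is a hedged sketch of how a comparable merge could be reproduced: the merge_method, base_model, t array, and dtype come from the details above, while the layer_range values and output path are assumptions (the card does not state them).

```python
# Sketch: writing a mergekit SLERP config and running the mergekit-yaml CLI.
# layer_range is assumed (the card does not list layer counts); everything
# else mirrors the merge details described above.
import pathlib
import subprocess
import textwrap

config = textwrap.dedent("""\
    merge_method: slerp
    base_model: mergekit-community/Slush-Lyra-Gutenberg-Bophades
    dtype: bfloat16
    slices:
      - sources:
          - model: DreadPoor/Irix-12B-Model_Stock
            layer_range: [0, 40]   # assumed depth for a 12B Nemo-style model
          - model: mergekit-community/Slush-Lyra-Gutenberg-Bophades
            layer_range: [0, 40]
    parameters:
      t: [0.5, 0.5, 0.5, 0.5, 0.5]  # equal weighting for both parents
    """)

pathlib.Path("slerp_config.yaml").write_text(config)
# mergekit reads the YAML and writes the merged checkpoint to ./merged
subprocess.run(["mergekit-yaml", "slerp_config.yaml", "./merged"], check=True)
```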

Key Characteristics

  • Parameter Count: 12 billion.
  • Merge Method: SLERP, which interpolates parent weights along an arc rather than a straight line and tends to preserve the character of both constituent models.
  • Context Length: 32,768 tokens, supporting long-form content generation and understanding.
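As a quick sanity check, the configured context window can be read from the model's published configuration. This is a minimal sketch; the max_position_embeddings field name assumes a Mistral/Nemo-style config, which the card does not show:

```python
from transformers import AutoConfig

# Read the context window from the hosted config; the field name is an
# assumption based on Mistral/Nemo-style architectures.
config = AutoConfig.from_pretrained("mergekit-community/Irix-12B_Slush_V2")
print(config.max_position_embeddings)  # expected: 32768 per this card
```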

Use Cases

Given its merged nature and substantial context window, Irix-12B_Slush_V2 is suitable for a range of applications, including:

  • General text generation and completion (a minimal loading sketch follows this list).
  • Tasks requiring deep contextual understanding over long passages.
  • As a foundational model for further fine-tuning on specific downstream tasks.
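As a concrete starting point for the text-generation use case, the sketch below loads the model with Hugging Face transformers and generates a continuation. The prompt, sampling settings, and dtype/device choices are assumptions, since the card does not document a chat template or recommended parameters:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mergekit-community/Irix-12B_Slush_V2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge itself was performed in bfloat16
    device_map="auto",
)

prompt = "The old lighthouse keeper climbed the spiral stairs and"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For fine-tuning, the same checkpoint can be loaded this way and trained with standard causal-LM tooling; no task-specific recipe is published on the card.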