ajtaltarabukin2022/merged_beat_champ_2model_slerp_champ

Text Generation · Concurrency Cost: 2 · Model Size: 32B · Quant: FP8 · Ctx Length: 32k · Published: Apr 18, 2026 · Architecture: Transformer · Cold

ajtaltarabukin2022/merged_beat_champ_2model_slerp_champ is a 32-billion-parameter language model created by ajtaltarabukin2022 through a SLERP merge of two pre-trained Affine models. By blending the weights of its constituents, it offers a combined performance profile and is intended for general language tasks that can benefit from the strengths of both source models.


Model Overview

merged_beat_champ_2model_slerp_champ is a 32-billion-parameter language model developed by ajtaltarabukin2022. It was created with the MergeKit tool using the SLERP (Spherical Linear Interpolation) merge method.

Merge Details

This model is a composite of two distinct pre-trained language models:

  • fakemoonlo/Affine-5FnfLT3ntQXDsAnVC5H5WNQYVTY7SSCbxU3kxqhNybtJeNGb
  • dura-lori/affine-5DoKPQhZmKnFk4mNEmH4UorbqHDe3PFAPvEfJyDwNkimoAMe

The merge configuration weighted the dura-lori model at 0.55 and the fakemoonlo model at 0.45 across layers 0 through 64, with the global SLERP interpolation parameter t set to 0.45. Rather than averaging weights linearly, SLERP interpolates along the arc between the two models' parameter vectors, which aims to preserve the geometry of the learned representations while combining them.
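The core SLERP operation can be sketched in a few lines of NumPy. This is an illustrative toy, not MergeKit's actual implementation (which operates per layer over real checkpoint tensors and handles dtypes, tokenizers, and edge cases); the vectors below are hypothetical stand-ins for a pair of corresponding weight tensors, flattened:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t moves along the arc
    between the two directions while interpolating magnitude.
    """
    v0_unit = v0 / (np.linalg.norm(v0) + eps)
    v1_unit = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_unit, v1_unit), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two weight directions
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    return (np.sin((1 - t) * theta) * v0 + np.sin(t * theta) * v1) / np.sin(theta)

# Toy stand-ins for corresponding (flattened) layer weights of the two models
layer_dura_lori = np.array([1.0, 0.0])
layer_fakemoonlo = np.array([0.0, 1.0])

# With t = 0.45, the result leans slightly toward the first argument,
# mirroring the 0.55 / 0.45 split described above.
merged_layer = slerp(0.45, layer_dura_lori, layer_fakemoonlo)
```

With t = 0.45, the sin((1 - t)·θ) coefficient on the first model is the larger of the two, which is how a single scalar t encodes the 0.55/0.45 weighting.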

Potential Use Cases

As a merged model, it is expected to exhibit a blend of its source models' capabilities. It is suitable for general natural language processing tasks where combining the strengths of the original Affine models is desirable; depending on the characteristics of the merged components, this may yield improved generalization or better performance on specific tasks.