ivangrapher/merged_champion_v2

Text Generation • Model Size: 32B • Quant: FP8 • Context Length: 32k • Concurrency Cost: 2 • Published: Apr 11, 2026 • Architecture: Transformer

ivangrapher/merged_champion_v2 is a 32-billion-parameter language model created by ivangrapher by merging multiple pre-trained models with the DARE TIES method. It integrates several 'affine' models, with dura-lori/affine-5ED5dwT4fztHjgjyR6vXpbGfnooeuWfr3VueaZrrfWJSou7y serving as the base, and is designed for general language generation tasks that leverage the combined strengths of its constituent models.


Model Overview

merged_champion_v2 was constructed with the MergeKit tool, using the DARE TIES merge method to combine several pre-trained models into a single 32B-parameter checkpoint.

Merge Details

This model is a composite of several pre-trained 'affine' models, with dura-lori/affine-5ED5dwT4fztHjgjyR6vXpbGfnooeuWfr3VueaZrrfWJSou7y serving as the base model. The merge combined contributions from:

  • catKnowCoffiee/Affine2-5EPhxsSDWnNzYjZdupuC5WLi2a5M8FYfnkvo5ukWM8Yge9zi
  • dura-lori/affine-5FcYc4MZ2z9yfFp6qPBQQjtS3cXkDV7x46ZUcoUP3pFRGoj4
  • leary-comos/affine-5CSqun1nmHbJQuvxyvJ534ZBpbFUUT1hoWXAuj18k7Qs7g2R

The DARE TIES method was applied with a specific weight for each contributing model, consolidating their learned representations into a single, more capable model. DARE (Drop And REscale) randomly drops a fraction of each model's delta parameters relative to the base and rescales the remainder, while TIES resolves sign conflicts between models before summing. The configuration specified a density of 0.3, meaning roughly 30% of each model's delta parameters were retained, and a normalization value of 1.0 for the merged parameters.
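For readers who want to reproduce a similar merge, the following is a minimal sketch of what the MergeKit configuration might look like. The merge method, base model, contributing models, density of 0.3, and normalize value of 1.0 come from the details above; the per-model weights (0.33 each) and the dtype are hypothetical placeholders, since the card does not publish the exact weighting.

```python
import subprocess

import yaml  # pip install pyyaml

# Sketch of a MergeKit DARE TIES config matching the details above.
# NOTE: the per-model weights (0.33) and the dtype are assumptions;
# the card only states density=0.3 and normalize=1.0.
contributors = [
    "catKnowCoffiee/Affine2-5EPhxsSDWnNzYjZdupuC5WLi2a5M8FYfnkvo5ukWM8Yge9zi",
    "dura-lori/affine-5FcYc4MZ2z9yfFp6qPBQQjtS3cXkDV7x46ZUcoUP3pFRGoj4",
    "leary-comos/affine-5CSqun1nmHbJQuvxyvJ534ZBpbFUUT1hoWXAuj18k7Qs7g2R",
]

config = {
    "merge_method": "dare_ties",
    "base_model": "dura-lori/affine-5ED5dwT4fztHjgjyR6vXpbGfnooeuWfr3VueaZrrfWJSou7y",
    "models": [
        {"model": m, "parameters": {"weight": 0.33, "density": 0.3}}
        for m in contributors
    ],
    "parameters": {"normalize": 1.0},
    "dtype": "bfloat16",  # assumption; the published checkpoint is served in FP8
}

with open("merge_config.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# mergekit-yaml is MergeKit's CLI entry point: config in, merged model out.
subprocess.run(
    ["mergekit-yaml", "merge_config.yaml", "./merged_champion_v2"],
    check=True,
)
```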

Intended Use

As a merged model, merged_champion_v2 is suited to a broad range of natural language processing tasks, drawing on the diverse capabilities inherited from its constituent models. Its 32B parameter count and 32,768-token context window suggest it can handle complex prompts and generate detailed responses.
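Below is a minimal sketch of loading the model for inference with Hugging Face Transformers, assuming the checkpoint is hosted under the repo ID shown on this page and that the environment has enough GPU memory for a 32B model:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the merged checkpoint is published under this repo ID.
model_id = "ivangrapher/merged_champion_v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the dtype stored in the checkpoint
    device_map="auto",   # shard across available GPUs; 32B needs substantial VRAM
)

prompt = "Summarize the DARE TIES merge method in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```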