jenny08311/affine-test-1

Text Generation · Concurrency Cost: 2 · Model Size: 32B · Quant: FP8 · Ctx Length: 32k · Published: Apr 10, 2026 · Architecture: Transformer · Cold

jenny08311/affine-test-1 is a 32-billion-parameter language model created by merging multiple pre-trained models with the TIES merge method, using Qwen/Qwen3-32B as its base. It integrates contributions from five distinct 'Affine' models, combining their learned parameters, and is designed for general language tasks, leveraging the strengths of its constituent models and the Qwen3-32B architecture.


Model Overview

jenny08311/affine-test-1 is a 32-billion-parameter language model produced by merging several pre-trained models with the TIES merge method. It is built on the Qwen/Qwen3-32B base model and inherits its foundational capabilities.

Merge Details

This model is a composite of five different 'Affine' models, specifically:

  • roaringcat1/Affine-0327e2-5EcNJ9jwSeEaNKUKvQgZkoy345hxCZX9Dxh3Tay43Me4nhwN
  • catKnowCoffiee/Affine2-5EPhxsSDWnNzYjZdupuC5WLi2a5M8FYfnkvo5ukWM8Yge9zi
  • leary-comos/affine-5CXjrfQeeKoXErUY4jGysVsNqvLhry32LrToJnL7GmrVhFSE
  • axon1/affine_m19_5CJHUdkdDJkgb6wdE3ZEL8E7N88LsUhTgfztTWVnnnFsmh8d
  • oliverchang/Affine-95-5HL2tZAma8d9BAsqZWdFvhdjrxjqMyBZyPVKhknRtHESTKLe

The merge uses per-slice parameter configurations, with separate density and weight values filtered to the MLP and self-attention layers across the full layer range (0-64). This fine-grained configuration aims to combine the diverse strengths of the individual models into a single, more robust language model; a sketch of the underlying merge step follows.
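The TIES method (trim, elect sign, disjoint merge) resolves sign conflicts between donor models instead of averaging their deltas blindly. Below is a minimal sketch of one TIES step for a single parameter tensor, assuming PyTorch; the function name and the density and weight values are illustrative, and mergekit's production implementation differs in detail:

```python
# Minimal TIES merge sketch for one parameter tensor (illustrative, not mergekit's code).
import torch

def ties_merge(base: torch.Tensor,
               finetuned: list[torch.Tensor],
               density: float = 0.6,
               weights: list[float] | None = None) -> torch.Tensor:
    weights = weights or [1.0] * len(finetuned)
    trimmed = []
    for ft in finetuned:
        delta = ft - base                                  # task vector vs. the base model
        n, k = delta.numel(), max(1, int(density * delta.numel()))
        threshold = delta.abs().flatten().kthvalue(n - k + 1).values
        trimmed.append(delta * (delta.abs() >= threshold))  # trim: keep top-`density` fraction by magnitude
    stacked = torch.stack([w * d for w, d in zip(weights, trimmed)])
    elected = torch.sign(stacked.sum(dim=0))               # elect a majority sign per entry
    agree = torch.sign(stacked) == elected                 # drop entries that conflict with the elected sign
    count = agree.sum(dim=0).clamp(min=1)
    return base + (stacked * agree).sum(dim=0) / count     # disjoint mean, added back onto the base
```

In a real merge this step runs per tensor, with the density and weight chosen per slice and per layer type (MLP vs. self-attention), matching the configuration described above.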

Intended Use

This model is suitable for general-purpose language generation and understanding tasks, benefiting from the collective knowledge embedded in its merged components and the strong base of Qwen3-32B.
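As a usage sketch, the model should load through the standard transformers text-generation API under the id shown in this card; the generation parameters below are illustrative, and running the FP8 checkpoint at this size may require a recent transformers release and suitable GPU memory:

```python
# Minimal inference sketch, assuming the model id from this card is available on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jenny08311/affine-test-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick up the checkpoint's stored dtype (FP8 per the header)
    device_map="auto",    # shard the 32B model across available devices
)

messages = [{"role": "user", "content": "Summarize the TIES merge method in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```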