ajtaltarabukin2022/sonnet1

Text Generation · Concurrency Cost: 2 · Model Size: 32B · Quant: FP8 · Ctx Length: 32k · Published: Apr 19, 2026 · Architecture: Transformer · Status: Cold

ajtaltarabukin2022/sonnet1 is a 32 billion parameter language model created by ajtaltarabukin2022, based on the Qwen3-32B architecture. It is a merge of several pre-trained language models using the DARE TIES method, designed to combine their strengths. A 32,768-token context length makes it suitable for tasks requiring extensive contextual understanding, and its main differentiator is the merged architecture, which aims for stronger performance across language generation tasks.


Model Overview

ajtaltarabukin2022/sonnet1 is a 32 billion parameter language model developed by ajtaltarabukin2022. It is built upon the Qwen3-32B base model and leverages the DARE TIES merge method, a technique designed to combine the capabilities of multiple pre-trained models. This approach aims to synthesize diverse strengths into a single, more robust model.

Merge Details

The model was constructed by merging three distinct affine models:

  • michael-chan-000/affine-5Eh8v9zUpcBwNLRzE3bRv2FFhnaNPERRLdvEH8SdwLiahUh8
  • leary-comos/affine-5CFnCUCy5jDjXFQJV5L58Wi8wwyp1b9Xe2fQ9iaSfiFdkR1X
  • fakemoonlo/Affine-5FnfLT3ntQXDsAnVC5H5WNQYVTY7SSCbxU3kxqhNybtJeNGb

Each of these models contributed to the final sonnet1 through a weighted combination across their layers, as configured in the mergekit YAML. The base model, Qwen3-32B, was also included in the merge. This strategy allows specialized knowledge or improved generalization from the constituent models to carry over into the merged result.
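The actual mergekit YAML is not reproduced here, but a DARE TIES merge of these models would typically be configured along the following lines. Note this is a sketch: the `weight`, `density`, and `dtype` values are illustrative assumptions, not the values actually used for sonnet1.

```yaml
# Hypothetical mergekit config for a DARE TIES merge (illustrative values only).
models:
  - model: michael-chan-000/affine-5Eh8v9zUpcBwNLRzE3bRv2FFhnaNPERRLdvEH8SdwLiahUh8
    parameters:
      weight: 0.4     # contribution of this model's deltas
      density: 0.6    # fraction of delta parameters retained by DARE
  - model: leary-comos/affine-5CFnCUCy5jDjXFQJV5L58Wi8wwyp1b9Xe2fQ9iaSfiFdkR1X
    parameters:
      weight: 0.3
      density: 0.6
  - model: fakemoonlo/Affine-5FnfLT3ntQXDsAnVC5H5WNQYVTY7SSCbxU3kxqhNybtJeNGb
    parameters:
      weight: 0.3
      density: 0.6
merge_method: dare_ties
base_model: Qwen/Qwen3-32B
dtype: bfloat16
```

Each entry's `weight` sets its share of the merged deltas, while `density` controls how aggressively DARE sparsifies each model's difference from the base before merging.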

Key Characteristics

  • Architecture: Based on Qwen3-32B.
  • Parameter Count: 32 billion parameters.
  • Context Length: Supports a substantial context window of 32768 tokens, beneficial for processing longer texts and maintaining conversational coherence.
  • Merge Method: Utilizes DARE TIES, which sparsifies each model's parameter deltas relative to the base (DARE) and resolves sign conflicts between the surviving deltas (TIES) before combining them, reducing interference between the merged models.
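To make the DARE step concrete, here is a minimal toy sketch of its core idea on flat parameter arrays: compute each fine-tuned model's delta from the base, randomly drop a fraction `drop_p` of delta entries, rescale the survivors by `1/(1 - drop_p)`, and average the sparsified deltas onto the base. This is an illustration only, not mergekit's implementation, and it omits the TIES sign-election step applied in the full DARE TIES method.

```python
import numpy as np

def dare_merge(base, finetuned_list, drop_p=0.9, seed=0):
    """Toy DARE-style merge of flat parameter arrays (illustrative only)."""
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base)
    for ft in finetuned_list:
        delta = ft - base
        # Randomly keep ~(1 - drop_p) of the delta entries...
        mask = rng.random(delta.shape) >= drop_p
        # ...and rescale the survivors so the expected delta is preserved.
        merged_delta += np.where(mask, delta, 0.0) / (1.0 - drop_p)
    # Average the sparsified deltas and apply them to the base weights.
    return base + merged_delta / len(finetuned_list)
```

With `drop_p=0.0` no entries are dropped and the result reduces to a plain average of the fine-tuned models; at high `drop_p`, most of each delta is discarded yet the rescaling keeps its expected contribution intact.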

Potential Use Cases

Given its large parameter count and extended context window, ajtaltarabukin2022/sonnet1 is well-suited for applications requiring deep contextual understanding and generation, such as:

  • Advanced content creation and summarization.
  • Complex question answering.
  • Long-form dialogue and conversational AI.
  • Tasks benefiting from the combined strengths of its merged components.