ABDUL-HASEEB-TANOLI/HAIDER-Math-32B-v1

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Context Length: 32k · Published: Apr 30, 2026 · Architecture: Transformer

ABDUL-HASEEB-TANOLI/HAIDER-Math-32B-v1 is a 32.8 billion parameter language model created by ABDUL-HASEEB-TANOLI using the Task Arithmetic merge method. It merges components from qwq-32b and deepseek-r1-32b on top of a qwen-32b base, aiming to enhance mathematical reasoning. The model targets tasks that require robust numerical and logical processing within a 32,768-token context window.


Model Overview

ABDUL-HASEEB-TANOLI/HAIDER-Math-32B-v1 is a 32.8 billion parameter language model developed by ABDUL-HASEEB-TANOLI. This model was created using the Task Arithmetic merge method, combining specific pre-trained language models to potentially enhance its capabilities, particularly in mathematical domains.
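To make the merge method concrete: in Task Arithmetic, each fine-tuned model contributes a "task vector" (its weights minus the base model's weights), and the merged model is the base plus a weighted sum of those vectors. The toy sketch below illustrates this on plain lists of floats; the function name and the normalization choice (dividing by the sum of weights, as common merge tooling does) are illustrative assumptions, and a real merge operates on full checkpoints, not lists.

```python
def task_arithmetic_merge(base, tuned_models, weights, normalize=True):
    """Toy Task Arithmetic merge over per-parameter lists of floats.

    base         -- base model weights, e.g. [w0, w1, ...]
    tuned_models -- list of fine-tuned weight lists, same length as base
    weights      -- one scalar weight per fine-tuned model
    normalize    -- if True, divide the combined task vector by sum(weights)
    """
    merged = []
    for i, b in enumerate(base):
        # Task vector for each model: its delta from the base weights,
        # scaled by that model's merge weight.
        combined = sum(w * (m[i] - b) for m, w in zip(tuned_models, weights))
        if normalize:
            combined /= sum(weights)
        merged.append(b + combined)
    return merged


# Two hypothetical fine-tunes merged onto a base with weight 0.5 each:
print(task_arithmetic_merge([1.0, 2.0], [[2.0, 2.0], [1.0, 4.0]], [0.5, 0.5]))
```

With weight 0.5 per model and normalization on, the result is the base plus the average of the two task vectors, which matches the merge parameters described below.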

Merge Details

The merge uses /home/azureuser/haider_project/models/qwen-32b as the base model and integrates components from two distinct models:

  • /home/azureuser/haider_project/models/qwq-32b
  • /home/azureuser/haider_project/models/deepseek-r1-32b

The merge applied a weight of 0.5 to each contributing model with parameter normalization enabled, and the entire operation was configured for the bfloat16 data type. This combination suggests an intent to blend the reasoning-focused strengths of both merged models into a single specialized checkpoint.
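The parameters above (per-model weight 0.5, normalization, bfloat16) match the shape of a mergekit `task_arithmetic` configuration. The source does not state which tool produced the merge, so the following YAML is a hypothetical reconstruction reusing the local paths listed above, not the author's actual config file.

```yaml
# Hypothetical mergekit config matching the stated merge parameters.
models:
  - model: /home/azureuser/haider_project/models/qwq-32b
    parameters:
      weight: 0.5
  - model: /home/azureuser/haider_project/models/deepseek-r1-32b
    parameters:
      weight: 0.5
merge_method: task_arithmetic
base_model: /home/azureuser/haider_project/models/qwen-32b
parameters:
  normalize: true
dtype: bfloat16
```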

Potential Use Cases

Given the model's name and the components involved, it is likely optimized for:

  • Mathematical problem-solving: Handling complex equations, proofs, and numerical reasoning.
  • Technical text analysis: Processing and generating content related to scientific or engineering fields.
  • Logical inference: tasks where precise, structured reasoning is paramount.