Zachary1150/math_merge_linear_1.5B
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Jan 20, 2026 · Architecture: Transformer · Warm

Zachary1150/math_merge_linear_1.5B is a 1.5 billion parameter language model created by Zachary1150 with the Linear merge method. It combines Zachary1150/math_acc_1.5B and Zachary1150/math_len_1.5B; the component names suggest a focus on mathematical accuracy and on handling longer mathematical solutions. The model is intended for tasks that require robust mathematical reasoning and problem solving.


Overview

Zachary1150/math_merge_linear_1.5B was created with the Linear merge method via mergekit, combining two specialized base models: Zachary1150/math_acc_1.5B and Zachary1150/math_len_1.5B. The merge aims to combine the strengths of both components, likely pairing the accuracy-focused model with one tuned for longer mathematical outputs.
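The Linear merge method used here is, at its core, a weighted average of the two models' parameters. A minimal sketch of that idea on toy state dicts (plain Python lists standing in for tensors; this illustrates the concept only and is not mergekit's actual implementation):

```python
def linear_merge(state_dicts, weights):
    """Weighted average of parameter vectors (here: plain lists of floats).

    Illustrative sketch of a Linear merge, not the mergekit implementation;
    real merges operate on torch tensors keyed by parameter name.
    """
    if len(state_dicts) != len(weights):
        raise ValueError("need one weight per model")
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize so weights sum to 1
    merged = {}
    for name in state_dicts[0]:
        params = [sd[name] for sd in state_dicts]
        merged[name] = [
            sum(w * p[i] for w, p in zip(norm, params))
            for i in range(len(params[0]))
        ]
    return merged

# Hypothetical tiny "state dicts" standing in for the two component models.
acc = {"layer.weight": [1.0, 2.0, 3.0]}     # stand-in for math_acc_1.5B
length = {"layer.weight": [3.0, 2.0, 1.0]}  # stand-in for math_len_1.5B
merged = linear_merge([acc, length], weights=[0.5, 0.5])
print(merged["layer.weight"])  # equal-weight average: [2.0, 2.0, 2.0]
```

Unequal weights (e.g. `[0.75, 0.25]`) would bias the merged model toward one component; the actual weights used for this model are not stated on the card.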

Key Capabilities

  • Enhanced Mathematical Reasoning: By merging models focused on 'accuracy' and 'length' in a mathematical context, this model is designed to perform well on complex math problems.
  • Optimized for Specific Math Tasks: The underlying components suggest a specialization in numerical and logical mathematical operations.

Good for

  • Mathematical problem-solving: Ideal for applications requiring precise calculations and logical deduction.
  • Educational tools: Can be used in systems that assist with or generate math exercises and solutions.
  • Research in model merging: Provides a practical example of the Linear merge method applied to domain-specific models.
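For the use cases above, inference should work like any Hugging Face causal LM. A hedged sketch using the transformers library (the repo id comes from this card; `build_math_prompt` and `solve` are hypothetical helpers, the prompt template is an assumption, and `solve` is defined but not called because it downloads the full BF16 checkpoint):

```python
def build_math_prompt(problem: str) -> str:
    """Hypothetical prompt template; adjust to whatever template the
    base models were actually trained with."""
    return (
        "Solve the following problem step by step.\n"
        f"Problem: {problem}\nSolution:"
    )

def solve(problem: str, repo: str = "Zachary1150/math_merge_linear_1.5B") -> str:
    """Download the model and generate a solution (sketch only; requires
    `pip install transformers torch` and pulls the full checkpoint)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)
    inputs = tokenizer(build_math_prompt(problem), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(out[0], skip_special_tokens=True)

print(build_math_prompt("What is 17 * 24?"))
```

With the 32k context length listed above, longer multi-step problem statements should fit in a single prompt.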