Alelcv27/Llama3.2-3B-BreadcrumbsTIES-Math-Code

Text generation · Concurrency cost: 1 · Model size: 3.2B · Quant: BF16 · Context length: 32k · Published: Apr 19, 2026 · Architecture: Transformer

Alelcv27/Llama3.2-3B-BreadcrumbsTIES-Math-Code is a 3.2-billion-parameter language model based on the Llama 3.2 architecture, created by Alelcv27. It is a merge of specialized base models, using the Breadcrumbs with TIES method to combine mathematical and coding capabilities. The model is optimized for tasks requiring strong performance in both mathematics and code generation, and offers a 32,768-token context length.


Model Overview

Alelcv27/Llama3.2-3B-BreadcrumbsTIES-Math-Code is a 3.2 billion parameter language model derived from the Llama 3.2 base architecture. It was created by Alelcv27 using the mergekit tool, specifically employing the Model Breadcrumbs with TIES merge method.
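In mergekit, a merge like this is declared in a YAML config. The sketch below is an illustration inferred from the model names and densities stated on this page, not the author's actual file; the base model reference and the `weight` values in particular are assumptions:

```yaml
# Hypothetical mergekit config; base_model and weights are assumed, not confirmed.
merge_method: breadcrumbs_ties
base_model: meta-llama/Llama-3.2-3B
models:
  - model: Alelcv27/Llama3.2-3B-Base-Math
    parameters:
      density: 0.6   # fraction of the math task vector retained
      weight: 0.5
  - model: Alelcv27/Llama3.2-3B-Base-Code
    parameters:
      density: 0.4   # fraction of the code task vector retained
      weight: 0.5
dtype: bfloat16
```

Running `mergekit-yaml config.yml ./output` with such a file produces the merged checkpoint.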

Key Capabilities

This model is a strategic merge of two specialized base models:

  • Alelcv27/Llama3.2-3B-Base-Math: Contributes enhanced mathematical reasoning and problem-solving.
  • Alelcv27/Llama3.2-3B-Base-Code: Provides strong capabilities in code generation, understanding, and related tasks.

The merge configuration assigns a density of 0.6 to the mathematical component and 0.4 to the coding component, aiming for balanced performance across both domains. The model retains the base architecture's 32,768-token context length.
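Conceptually, Model Breadcrumbs sparsifies each expert's task vector (its delta from the base weights) by discarding both the largest-magnitude outliers and the smallest-magnitude entries, keeping only a middle band whose size is set by the density; TIES then resolves sign conflicts between experts before averaging. The toy NumPy sketch below illustrates that pipeline on flat arrays under those assumptions; the function names and the `top_discard` parameter are illustrative, not mergekit's actual API:

```python
import numpy as np

def breadcrumbs_mask(delta, density=0.6, top_discard=0.01):
    """Keep a middle band of the task vector: drop the largest-magnitude
    `top_discard` fraction (outliers), then keep only the `density`
    fraction of entries sitting just below them."""
    n = delta.size
    order = np.argsort(np.abs(delta))          # ascending by magnitude
    n_top = int(round(top_discard * n))        # outliers to discard
    n_keep = int(round(density * n))           # entries to retain
    hi = n - n_top
    lo = max(hi - n_keep, 0)
    mask = np.zeros(n, dtype=bool)
    mask[order[lo:hi]] = True
    return mask.reshape(delta.shape)

def breadcrumbs_ties_merge(base, experts, densities, top_discard=0.01):
    """Sparsify each expert's delta with Breadcrumbs, then merge with
    TIES-style sign election and agreement-only averaging."""
    sparse = []
    for weights, density in zip(experts, densities):
        delta = weights - base
        keep = breadcrumbs_mask(delta, density, top_discard)
        sparse.append(np.where(keep, delta, 0.0))
    stacked = np.stack(sparse)
    # TIES: elect the dominant sign per parameter, keep only agreeing deltas
    sign = np.sign(stacked.sum(axis=0))
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    total = np.where(agree, stacked, 0.0).sum(axis=0)
    count = np.maximum(agree.sum(axis=0), 1)
    return base + total / count

rng = np.random.default_rng(0)
base = rng.normal(size=1000)
math_expert = base + rng.normal(size=1000)   # stand-in for the math model
code_expert = base + rng.normal(size=1000)   # stand-in for the code model
merged = breadcrumbs_ties_merge(base, [math_expert, code_expert], [0.6, 0.4])
```

In a real merge this runs per weight tensor across the whole checkpoint; mergekit handles that bookkeeping, along with tokenizer and config merging.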

Ideal Use Cases

  • Mathematical Problem Solving: Excels in tasks requiring numerical reasoning, equation solving, and mathematical concept understanding.
  • Code Generation and Analysis: Suitable for generating code snippets, debugging, explaining code, and other programming-related applications.
  • Hybrid Applications: Particularly effective in scenarios where both mathematical precision and coding proficiency are required, such as scientific computing or data analysis scripting.