Alelcv27/Llama3.1-8B-Arcee-Math-Code-v3

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Apr 4, 2026 · Architecture: Transformer · Cold

Alelcv27/Llama3.1-8B-Arcee-Math-Code-v3 is an 8-billion-parameter language model based on the Llama 3.1 architecture. Created by Alelcv27 with the Arcee Fusion merge method, it combines a math-focused Llama 3.1 base with a code-optimized Llama 3.1 variant, and is designed to handle both mathematical reasoning and code generation tasks.


Alelcv27/Llama3.1-8B-Arcee-Math-Code-v3 Overview

This model is an 8-billion-parameter language model built on the Llama 3.1 architecture. It was created by Alelcv27 using the Arcee Fusion merge method via mergekit.

Key Capabilities

  • Dual Specialization: The model integrates capabilities from two distinct Llama 3.1 variants:
    • A base model, Alelcv27/Llama3.1-8B-Math-v3, providing strong mathematical reasoning abilities.
    • An additional model, Alelcv27/Llama3.1-8B-Code-v2, enhancing its proficiency in code generation and understanding.
  • Merged Architecture: The fusion process combined weights from both specialized models across all 32 transformer layers, aiming for balanced performance in both domains.
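Based on the components named above, a merge like this is typically driven by a short mergekit YAML config. The actual config was not published here, so the following is a minimal sketch assuming the `arcee_fusion` merge method identifier and a `bfloat16` dtype; the two model IDs come from the card itself:

```yaml
# Hypothetical mergekit config reconstructing this merge (not the published one).
# Arcee Fusion merges exactly two models: a base and one donor.
base_model: Alelcv27/Llama3.1-8B-Math-v3   # math-focused base (from the card)
merge_method: arcee_fusion
models:
  - model: Alelcv27/Llama3.1-8B-Code-v2    # code-optimized donor (from the card)
dtype: bfloat16                            # assumption; card only states FP8 for serving quant
```

A config in this shape would be run with mergekit's CLI (e.g. `mergekit-yaml config.yml ./output-dir`), producing a merged checkpoint covering all 32 layers.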

Good For

  • Mathematical Problem Solving: Ideal for tasks requiring numerical reasoning, equation solving, and other math-intensive applications.
  • Code Generation and Analysis: Suitable for generating programming code, assisting with debugging, or understanding code logic across various languages.
  • Hybrid Technical Tasks: Useful in scenarios where both strong mathematical comprehension and coding skills are simultaneously required.