Alelcv27/Llama3.1-8B-Arcee-Math-Code-v2

Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Context Length: 8k · Published: Apr 4, 2026 · Architecture: Transformer

Alelcv27/Llama3.1-8B-Arcee-Math-Code-v2 is an 8-billion-parameter language model based on the Llama 3.1 architecture, created by Alelcv27 with the Arcee Fusion merge method. It combines a Llama 3.1-8B base with specialized math and code models, and is optimized for tasks requiring strong mathematical reasoning and code generation, with an 8192-token context length.


Overview

Alelcv27/Llama3.1-8B-Arcee-Math-Code-v2 is an 8 billion parameter language model developed by Alelcv27. It was created using the Arcee Fusion merge method, combining a Llama 3.1-8B base model with two specialized models: Alelcv27/Llama3.1-8B-Math-v2 and Alelcv27/Llama3.1-8B-Code-v2. This merging strategy aims to leverage the strengths of both mathematical reasoning and code generation within a single model.
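The exact merge recipe for this model is not published. As a rough illustration only, the mergekit toolkit implements an `arcee_fusion` merge method, and a merge of the kind described above might be configured like the following sketch (model names are taken from this card; the base-model path, dtype, and all other settings are assumptions):

```yaml
# Hypothetical mergekit configuration -- the actual recipe used for
# this model is not published. Arcee Fusion fuses two models per pass,
# so a base + math + code combination would take two passes.
# Pass 1: fuse the math specialist into the Llama 3.1 base.
models:
  - model: meta-llama/Llama-3.1-8B
  - model: Alelcv27/Llama3.1-8B-Math-v2
merge_method: arcee_fusion
base_model: meta-llama/Llama-3.1-8B
dtype: bfloat16
# Pass 2 (a separate config of the same shape) would fuse
# Alelcv27/Llama3.1-8B-Code-v2 into the result of pass 1.
```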

Key Capabilities

  • Enhanced Mathematical Reasoning: Benefits from the integration of a dedicated math-focused model.
  • Improved Code Generation: Incorporates capabilities from a specialized code model.
  • Llama 3.1 Architecture: Built upon the robust Llama 3.1 foundation.
  • 8192 Token Context: Supports processing of moderately long inputs and outputs.
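Since the model retains the Llama 3.1 architecture, it also inherits the standard Llama 3.1 chat format. As a minimal sketch (the stock Llama 3.1 template, not a template published for this particular merge), a prompt can be assembled like this:

```python
def build_llama31_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the stock Llama 3.1 chat format.

    In practice, prefer tokenizer.apply_chat_template() so the
    tokenizer's own bundled template is used instead of this sketch.
    """
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama31_prompt(
    "You are a careful math and coding assistant.",
    "Write a Python function that returns the nth Fibonacci number.",
)
print(prompt)
```

The trailing assistant header leaves the prompt open for the model to complete, which is how generation is typically primed with this format.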

Good For

  • Mathematical Problem Solving: Ideal for applications requiring accurate numerical and logical reasoning.
  • Code Development: Suitable for tasks such as generating code snippets, debugging, or understanding programming logic.
  • Hybrid Applications: Useful in scenarios where both strong mathematical and coding abilities are required, such as scientific computing or data analysis scripting.