Alelcv27/Llama3.1-8B-Math-CoT
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 2, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Alelcv27/Llama3.1-8B-Math-CoT is an 8-billion-parameter Llama 3.1 instruction-tuned model developed by Alelcv27, fine-tuned from unsloth/llama-3.1-8b-instruct-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, enabling 2x faster training. With a 32,768-token context length, it is optimized for mathematical tasks and complex reasoning, leveraging chain-of-thought generation.
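As a sketch of how a chain-of-thought math query might be formatted for this model: the code below builds a prompt in the standard Llama 3.1 instruct chat template (the special tokens are from the published Llama 3.1 format, not this card; in practice you would use `tokenizer.apply_chat_template` from Hugging Face transformers, and the system/user text here is purely illustrative).

```python
# Illustrative sketch: constructing a Llama 3.1 chat prompt for a
# chain-of-thought math question. Special tokens follow the published
# Llama 3.1 instruct template; the prompt contents are hypothetical.

def build_llama31_prompt(system: str, user: str) -> str:
    """Format one system turn and one user turn, then open the
    assistant turn so the model generates its reasoning next."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama31_prompt(
    "You are a careful math tutor. Reason step by step before answering.",
    "If 3x + 5 = 20, what is x?",
)
print(prompt.count("<|eot_id|>"))  # two completed turns precede the open assistant turn
```

The trailing assistant header with no `<|eot_id|>` is what cues the model to continue with its own (chain-of-thought) response.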
