Neelectric/Llama-3.1-8B-Instruct_SFT_math00.01
Text generation | Concurrency cost: 1 | Model size: 8B | Quantization: FP8 | Context length: 32k | Published: Mar 25, 2026 | Architecture: Transformer | Status: Cold

Neelectric/Llama-3.1-8B-Instruct_SFT_math00.01 is an 8-billion-parameter instruction-tuned language model, fine-tuned by Neelectric from Meta's Llama-3.1-8B-Instruct. It was trained with supervised fine-tuning (SFT) on the OpenR1-Math-220k_extended_Llama3_4096toks dataset, specializing it in mathematical reasoning and problem solving. With a context length of 32,768 tokens, it can handle long, multi-step mathematical queries and produce detailed solutions.
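A minimal sketch of querying the checkpoint locally with Hugging Face `transformers`. The repo id comes from this card; everything else (that the checkpoint is publicly downloadable, the single-turn Llama 3.1 chat-template string, the sample question, and the `max_new_tokens` value) is an assumption for illustration, not an official usage snippet from the model authors.

```python
# Hypothetical usage sketch for this model card; assumes `transformers`
# and `torch` are installed and the checkpoint is publicly accessible.
MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SFT_math00.01"


def build_prompt(question: str) -> str:
    """Format a single-turn user prompt in the Llama 3.1 chat template.

    This mirrors what tokenizer.apply_chat_template would produce for one
    user message; shown explicitly here so the format is visible.
    """
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{question}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


if __name__ == "__main__":
    # Heavy calls are guarded so the prompt helper can be used standalone.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # A math-flavored query, matching the model's SFT specialization.
    inputs = tokenizer(
        build_prompt("Solve for x: 3x + 7 = 22. Show your steps."),
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In practice `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` is the more robust way to build the prompt, since it reads the template shipped with the tokenizer rather than hard-coding it.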
