Neelectric/Llama-3.1-8B-Instruct_SFT_mathfisher_v00.03
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 27, 2026 · Architecture: Transformer
Neelectric/Llama-3.1-8B-Instruct_SFT_mathfisher_v00.03 is an 8-billion-parameter instruction-tuned language model developed by Neelectric, fine-tuned from Meta's Llama-3.1-8B-Instruct. The model specializes in mathematical reasoning and problem-solving, having been trained on the Neelectric/OpenR1-Math-220k_all_Llama3_4096toks dataset. With a 32,768-token (32k) context length, it is suited to tasks that require extended mathematical reasoning.
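The checkpoint can be loaded like any other Llama-3.1-based chat model. Below is a minimal sketch using the Hugging Face `transformers` library; the repo ID comes from this card, while the dtype/device settings, `max_new_tokens` value, and the example prompt are illustrative assumptions, not settings published by the model author.

```python
# Minimal usage sketch for this checkpoint with Hugging Face transformers.
# The repo ID is taken from the model card; generation settings below are
# illustrative assumptions, not the author's published configuration.

MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SFT_mathfisher_v00.03"


def generate_answer(prompt: str, max_new_tokens: int = 512) -> str:
    """Load the model lazily and answer a single math question."""
    # Imported inside the function so the module can be inspected without
    # downloading the 8B checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_answer("Solve for x: 2x + 6 = 14. Show your steps."))
```

Since the card lists an FP8 quantization, a hosted FP8 endpoint may be the intended serving path; the sketch above simply loads whatever weights the repo provides.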