Neelectric/Llama-3.1-8B-Instruct_SDFT_mathv00.09
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Apr 8, 2026 · Architecture: Transformer

Neelectric/Llama-3.1-8B-Instruct_SDFT_mathv00.09 is an 8-billion-parameter instruction-tuned language model developed by Neelectric, based on Meta's Llama-3.1-8B-Instruct. It was fine-tuned with the SDFT method on the OpenR1-Math-220k_all_SDFT_nr dataset to specialize it for mathematical reasoning and problem-solving tasks. With a 32K context length, the model is suited to applications that require robust mathematical capabilities over long inputs.
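The model card does not include a quickstart, so here is a minimal sketch of querying the model through an OpenAI-compatible chat-completions endpoint, which most hosting providers expose. `BASE_URL` and `API_KEY` are placeholders, not values from this page — substitute your provider's actual endpoint and credentials.

```python
# Sketch of calling this model via an OpenAI-compatible
# chat-completions API. BASE_URL and API_KEY are placeholders.
import json
import urllib.request

BASE_URL = "https://example.com/v1"  # placeholder; provider-specific
API_KEY = "YOUR_API_KEY"             # placeholder

MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SDFT_mathv00.09"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a POST request for a single-turn math query."""
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_request("If 3x + 5 = 20, what is x?")
# With a real endpoint and key, send it with:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Keeping `max_tokens` generous (here 512) matters for math-tuned models, which often emit step-by-step reasoning before the final answer.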
