Neelectric/Llama-3.1-8B-Instruct_SFT_Math-220kv00.29
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Jan 10, 2026 · Architecture: Transformer
Neelectric/Llama-3.1-8B-Instruct_SFT_Math-220kv00.29 is an 8-billion-parameter instruction-tuned language model developed by Neelectric, fine-tuned from meta-llama/Llama-3.1-8B-Instruct on the Neelectric/OpenR1-Math-220k_extended_Llama3_4096toks dataset to strengthen mathematical reasoning and problem-solving. With a context length of 32,768 tokens, the model is suited to applications requiring robust mathematical capabilities.
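As a minimal sketch of how one might prompt the model, the snippet below assembles a chat prompt by hand, assuming this fine-tune inherits the standard Llama 3.1 chat template from its base model (the special tokens and the `build_prompt` helper shown here are illustrative, not taken from the model card):

```python
# Minimal sketch: assemble a Llama 3.1-style chat prompt by hand.
# Assumption: this fine-tune keeps the base model's chat template,
# i.e. the special tokens <|begin_of_text|>, <|start_header_id|>,
# <|end_header_id|>, and <|eot_id|>.

def build_prompt(user_message: str,
                 system: str = "You are a helpful math assistant.") -> str:
    """Return a single prompt string in the Llama 3.1 chat format."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        # Leave the assistant header open so generation continues from here.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt("Solve for x: 2x + 6 = 14.")
print(prompt)
```

In practice most users would let a tokenizer's built-in chat template (e.g. `tokenizer.apply_chat_template` in `transformers`) do this formatting, but writing it out makes the expected input structure explicit.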