Neelectric/Llama-3.1-8B-Instruct_SafeGrad_mathv00.04
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Apr 10, 2026 · Architecture: Transformer

Neelectric/Llama-3.1-8B-Instruct_SafeGrad_mathv00.04 is an 8-billion-parameter instruction-tuned language model, fine-tuned by Neelectric from Meta's Llama-3.1-8B-Instruct. It was trained with supervised fine-tuning (SFT) using the TRL library and is optimized for general conversational tasks. Its 32,768-token context length makes it suitable for processing longer inputs.
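A minimal sketch of loading the model for chat-style generation with the Hugging Face `transformers` library, assuming the checkpoint is available on the Hub under the repo id above (the system prompt and generation settings are illustrative, not prescribed by the model card):

```python
"""Sketch: chat-style inference with a Llama-3.1-style instruct model.

Assumes the checkpoint is hosted on the Hugging Face Hub under MODEL_ID
and that `transformers` (and a GPU for practical speed) are available.
"""

MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SafeGrad_mathv00.04"


def build_chat(user_prompt: str) -> list[dict]:
    """Build the role/content message list expected by instruct models."""
    return [
        # System prompt is an illustrative placeholder, not from the model card.
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_prompt},
    ]


def main() -> None:
    # Imported here so the helper above can be used without transformers installed.
    from transformers import pipeline

    # First run downloads the 8B weights; device_map="auto" picks GPU if present.
    generator = pipeline("text-generation", model=MODEL_ID, device_map="auto")
    out = generator(build_chat("What is 17 * 24?"), max_new_tokens=256)
    # The pipeline returns the full conversation; the last message is the reply.
    print(out[0]["generated_text"][-1]["content"])


if __name__ == "__main__":
    main()
```

The message-list format works with any chat-tuned checkpoint on the Hub; the tokenizer's bundled chat template converts it into the Llama-3.1 prompt format automatically.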
