Neelectric/Llama-3.1-8B-Instruct_SFT_MoTv00.02
Task: Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quantization: FP8 · Context Length: 32k · Published: Feb 2, 2026 · Architecture: Transformer

Neelectric/Llama-3.1-8B-Instruct_SFT_MoTv00.02 is an 8-billion-parameter instruction-tuned causal language model developed by Neelectric. It is a fine-tuned version of Meta's Llama-3.1-8B-Instruct, trained on the Neelectric/MoT_all_Llama3_8192toks dataset. The model targets conversational and instruction-following tasks, and its 32,768-token context length allows it to work over long prompts and documents.
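A minimal sketch of how such a chat model could be queried locally with the Hugging Face `transformers` library. This is an illustrative assumption, not an official usage example from the model card: the `build_messages` and `generate` helpers are hypothetical names, and the heavy imports are kept inside the function so the prompt-formatting helper can be used on its own.

```python
MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SFT_MoTv00.02"


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format Llama 3.1 expects."""
    return [{"role": "user", "content": user_prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one chat turn against the model (downloads ~8B weights on first use)."""
    # Imported lazily so the lightweight helpers above work without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    input_ids = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize what a 32k-token context window enables."))
```

Because the model is published in FP8 with a 32k context, a single consumer GPU with sufficient VRAM may suffice for inference; adjust `device_map` and `max_new_tokens` to your hardware.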
