Neelectric/Llama-3.1-8B-Instruct_SFT_sciencev00.01
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quantization: FP8 · Context Length: 32k · Published: Jan 27, 2026 · Architecture: Transformer
Neelectric/Llama-3.1-8B-Instruct_SFT_sciencev00.01 is an 8-billion-parameter instruction-tuned language model developed by Neelectric, fine-tuned from Meta's Llama-3.1-8B-Instruct. It specializes in scientific-domain tasks, having been trained on the Neelectric/MoT_science_Llama3_4096toks dataset. The model is optimized for generating responses to scientific inquiries and discussions, and supports a 32,768-token context length.
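Since this model is fine-tuned from Llama-3.1-8B-Instruct, it presumably inherits the standard Llama 3.1 chat prompt template from its base model. The sketch below shows how that template renders a list of chat messages into a raw prompt string; the helper function name and the example messages are illustrative, not part of the model card, and in practice you would typically let a library such as `transformers` apply the template via the model's tokenizer rather than formatting it by hand.

```python
def format_llama31_prompt(messages, add_generation_prompt=True):
    """Render chat messages into the Llama 3.1 prompt format.

    `messages` is a list of {"role": ..., "content": ...} dicts,
    as commonly used with chat-style LLM APIs. This assumes the
    standard Meta Llama 3.1 template, which this fine-tune is
    presumed to inherit from its base model.
    """
    prompt = "<|begin_of_text|>"
    for msg in messages:
        prompt += (
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    if add_generation_prompt:
        # Cue the model to produce the assistant's reply next.
        prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

# Hypothetical example messages for a science-oriented query.
messages = [
    {"role": "system", "content": "You are a helpful scientific assistant."},
    {"role": "user", "content": "Explain CRISPR in one sentence."},
]
print(format_llama31_prompt(messages))
```

The resulting string is what the tokenizer would consume before generation; truncation against the 32k-token context window happens at the token level, not the character level.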