Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.09
Text generation | Concurrency cost: 1 | Model size: 8B | Quantization: FP8 | Context length: 32k | Published: Mar 22, 2026 | Architecture: Transformer

Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.09 is an 8-billion-parameter instruction-tuned language model, fine-tuned from Meta's Llama-3.1-8B-Instruct. It specializes in scientific-domain understanding and generation, having been trained on the Neelectric/MoT_science_Llama3_4096toks dataset, and is suited to tasks that require scientific knowledge and reasoning within its 32,768-token (32k) context window.
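A minimal usage sketch with the Hugging Face transformers library, assuming the checkpoint is published under the repo id above; the prompt, dtype, and device settings are illustrative and should be adjusted to your hardware:

```python
# Hypothetical loading sketch for this fine-tune via transformers.
# The repo id comes from the model card; everything else is an assumption.
MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.09"

def build_pipeline():
    # Imported lazily so the module can be inspected without torch/transformers installed.
    import torch
    from transformers import pipeline
    return pipeline(
        "text-generation",
        model=MODEL_ID,
        torch_dtype=torch.bfloat16,  # bf16 keeps the 8B model within ~16 GB of memory
        device_map="auto",           # place layers on available GPU(s), else CPU
    )

if __name__ == "__main__":
    pipe = build_pipeline()
    # Llama-3.1-Instruct models expect chat-style messages; the pipeline
    # applies the chat template automatically.
    messages = [
        {"role": "user", "content": "Explain in one sentence what a catalyst does."}
    ]
    out = pipe(messages, max_new_tokens=128)
    print(out[0]["generated_text"])
```

Generation runs only under the main guard, so importing the snippet (for example, to reuse `MODEL_ID` or `build_pipeline`) does not trigger a model download.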
