Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.04
Text generation | Concurrency cost: 1 | Model size: 8B | Quant: FP8 | Context length: 32k | Published: Mar 17, 2026 | Architecture: Transformer

Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.04 is an 8-billion-parameter instruction-tuned causal language model, fine-tuned by Neelectric from Llama-3.1-8B-Instruct. It was trained specifically on the Neelectric/MoT_science_Llama3_4096toks dataset, optimizing it for scientific-domain tasks. With a 32,768-token context length, the model is suited to applications that require deep understanding and generation within scientific contexts.
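A minimal sketch of loading the model with the Hugging Face `transformers` library. The generation parameters (`max_new_tokens`, the sampling settings) are illustrative assumptions, not values recommended by the model authors:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.04"


def generate(prompt: str) -> str:
    """Load the model and generate a reply to a single user prompt.

    Assumes a machine with enough memory for the 8B checkpoint;
    dtype and device placement are resolved automatically.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # place layers on available GPU(s)/CPU
    )
    # Llama 3.1 instruct models expect chat-formatted input.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain the difference between mitosis and meiosis."))
```

Because the checkpoint derives from Llama-3.1-8B-Instruct, it uses the standard Llama 3.1 chat template, so `apply_chat_template` handles the prompt formatting.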
