Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.10
- Task: Text generation
- Concurrency cost: 1
- Model size: 8B
- Quantization: FP8
- Context length: 32k
- Published: Mar 22, 2026
- Architecture: Transformer
Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.10 is an 8-billion-parameter instruction-tuned causal language model, fine-tuned by Neelectric from Llama-3.1-8B-Instruct. The model targets scientific and technical domains, having been trained on the Neelectric/MoT_science_Llama3_4096toks dataset. It is suited to scientific inquiries and complex technical discussions, and supports a 32,768-token context length.
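Since this is a standard Llama-3.1-based causal LM, it can be loaded with the Hugging Face `transformers` library in the usual way. The sketch below is an assumption based on common Llama 3.1 usage, not an official snippet from this model card; the prompt text is illustrative.

```python
# Minimal inference sketch for this model, assuming the standard
# Hugging Face transformers chat-template workflow for Llama 3.1.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.10"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; adjust to your hardware
    device_map="auto",
)

# Example scientific prompt (illustrative, not from the model card)
messages = [
    {"role": "user", "content": "Explain the Fisher information matrix in one paragraph."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Because the weights are published in FP8 per the listing above, a serving stack with FP8 support (or loading in bf16 as shown) may be needed depending on your hardware.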