Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.01
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Feb 4, 2026 · Architecture: Transformer

Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.01 is an 8-billion-parameter instruction-tuned causal language model developed by Neelectric. It is a fine-tuned version of Meta's Llama-3.1-8B-Instruct, optimized for scientific-domain tasks. The model supports a 32,768-token context length and was trained on the Neelectric/MoT_science_Llama3_4096toks dataset, making it suitable for applications requiring scientific reasoning and knowledge.
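As a fine-tune of an instruction-tuned Llama 3.1 checkpoint, the model can be loaded with the standard Hugging Face transformers API. The sketch below is illustrative, not an official usage snippet from the card: the system prompt, generation settings, and the `build_messages` helper are assumptions; only the model ID comes from the page.

```python
MODEL_ID = "Neelectric/Llama-3.1-8B-Instruct_SFT_sciencefisher_v00.01"


def build_messages(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format expected by
    the Llama 3.1 chat template. The system prompt is an assumption."""
    return [
        {"role": "system", "content": "You are a helpful scientific assistant."},
        {"role": "user", "content": question},
    ]


def main() -> None:
    # transformers is imported lazily so the helper above can be used
    # without the heavy dependency installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    messages = build_messages("Why does ice float on liquid water?")
    # apply_chat_template renders the Llama 3.1 prompt format and tokenizes it.
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Because the model was trained on sequences from a 4096-token dataset but inherits Llama 3.1's 32k context window, long-context behavior beyond the fine-tuning length is untested here.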
