Neelectric/Llama-3.1-8B-Instruct_SFT_sciencev00.20
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Feb 10, 2026 · Architecture: Transformer

Neelectric/Llama-3.1-8B-Instruct_SFT_sciencev00.20 is an 8-billion-parameter instruction-tuned causal language model, fine-tuned by Neelectric from the Meta Llama-3.1-8B-Instruct base model. It supports a 32,768-token (32k) context length and is fine-tuned specifically for scientific-domain tasks, making it well suited to scientific question answering and discussion.
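As a Llama-3.1-Instruct derivative, the model expects prompts in the standard Llama 3.1 chat format. In practice `tokenizer.apply_chat_template` from the `transformers` library handles this; the sketch below builds the prompt string by hand purely to illustrate the format, assuming this fine-tune keeps the base model's template unchanged:

```python
def format_llama31_prompt(messages):
    """Build a raw Llama 3.1 Instruct prompt string from chat messages.

    Each message is a dict with a "role" ("system", "user", or
    "assistant") and a "content" string.
    """
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    # End with an open assistant turn so generation continues as the assistant.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


prompt = format_llama31_prompt([
    {"role": "system", "content": "You are a scientific assistant."},
    {"role": "user", "content": "Why is the sky blue?"},
])
print(prompt)
```

The trailing assistant header is what cues the model to produce a reply; the completion is generated until the model emits its end-of-turn token.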
