Neelectric/Llama-3.2-1B-Instruct_SDFT_sciencev00.01
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 23, 2026 · Architecture: Transformer · Status: Warm

Neelectric/Llama-3.2-1B-Instruct_SDFT_sciencev00.01 is a 1-billion-parameter instruction-tuned language model with a 32,768-token context length. Developed by Neelectric, it belongs to the Llama-3.2 family. The fine-tune suffix (SDFT_sciencev00.01) suggests optimization for scientific-domain tasks, making it suitable for applications that require specialized scientific knowledge.
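As a sketch of how the model might be used, the snippet below loads it with the Hugging Face `transformers` library in BF16 (matching the quantization listed above) and asks a science question. The system prompt, generation settings, and the `generate` helper are illustrative assumptions, not documented defaults for this model.

```python
# Illustrative usage sketch for the model card above.
# MODEL_ID comes from the card; everything else is an assumption.

MODEL_ID = "Neelectric/Llama-3.2-1B-Instruct_SDFT_sciencev00.01"


def build_messages(question: str) -> list[dict]:
    """Build a chat-format message list for a science question.

    The system prompt is a placeholder, not one shipped with the model.
    """
    return [
        {"role": "system", "content": "You are a helpful scientific assistant."},
        {"role": "user", "content": question},
    ]


def generate(question: str, max_new_tokens: int = 256) -> str:
    """Load the model in BF16 and generate a reply (requires torch + transformers)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    inputs = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Note that a 1B model in BF16 needs roughly 2–3 GB of memory, so this should run on most consumer GPUs or even CPU, albeit slowly on the latter.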
