jackf857/Llama32-1b-Instruct-hh-sft-30
Text Generation · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Jan 16, 2026 · Architecture: Transformer

jackf857/Llama32-1b-Instruct-hh-sft-30 is a 1-billion-parameter instruction-tuned causal language model, fine-tuned from meta-llama/Llama-3.2-1B-Instruct using the TRL library. It supports a 32,768-token context length and targets general instruction-following tasks such as conversational AI and text generation.
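As a sketch of how the model might be used, the snippet below loads it through the Hugging Face `transformers` text-generation pipeline with the standard Llama chat-message format. The model id comes from this card; the system prompt, generation parameters, and `build_messages` helper are illustrative assumptions, not part of the model's documentation.

```python
def build_messages(user_prompt, system_prompt="You are a helpful assistant."):
    """Assemble a chat in the messages format used by Llama chat templates.

    Both prompts here are placeholders; adjust them for your use case.
    """
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

if __name__ == "__main__":
    # Import inside the guard so the helper above can be used without
    # transformers installed; loading downloads the BF16 weights on first use.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="jackf857/Llama32-1b-Instruct-hh-sft-30",
        torch_dtype="bfloat16",
    )
    out = generator(
        build_messages("Explain supervised fine-tuning in one sentence."),
        max_new_tokens=128,
    )
    # The pipeline returns the full chat; the last message is the reply.
    print(out[0]["generated_text"][-1]["content"])
```

Since the model inherits the Llama 3.2 chat template from its base model, passing a messages list (rather than a raw string) lets the pipeline apply that template automatically.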
