minpeter/Alpaca-Llama-3.2-1B-Instruct
Text Generation · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · License: llama3.2 · Architecture: Transformer

minpeter/Alpaca-Llama-3.2-1B-Instruct is a 1-billion-parameter instruction-tuned language model based on Meta's Llama-3.2-1B architecture. Fine-tuned by minpeter on the tatsu-lab/alpaca dataset, it is designed for general instruction-following tasks. The model supports a 32,768-token context window and is suited to deployments that need a compact yet capable LLM.
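Since the model was fine-tuned on the tatsu-lab/alpaca dataset, prompts in the classic Alpaca instruction format are a natural starting point. The sketch below builds such a prompt; the exact template this particular fine-tune expects is an assumption, so check the model's chat template before relying on it.

```python
# Hedged sketch: the standard Alpaca prompt format from the tatsu-lab/alpaca
# dataset. Whether this fine-tune uses this exact template is an assumption.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Format a user instruction in the Alpaca style."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("List three uses for a compact 1B-parameter LLM.")
print(prompt)
```

The resulting string can be tokenized and passed to the model with any standard text-generation stack (for example, the Hugging Face `transformers` library), generating until the model completes the `### Response:` section.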
