YazoPi/LlaMa3.2-1B-Instruct
Text Generation · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 13, 2026 · Architecture: Transformer · Warm

YazoPi/LlaMa3.2-1B-Instruct is a 1-billion-parameter instruction-tuned causal language model developed by YazoPi. With a context length of 32,768 tokens, it targets general-purpose conversational AI tasks, and its compact size makes it well suited to deployments where inference efficiency and resource constraints matter.
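A minimal inference sketch, assuming the model is published on the Hugging Face Hub under this ID and loaded with the `transformers` library; the prompt and generation parameters are illustrative, not prescribed by the model card:

```python
# Minimal inference sketch (assumes the model is reachable on the
# Hugging Face Hub and that `transformers` and `torch` are installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "YazoPi/LlaMa3.2-1B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
)

# Instruction-tuned models expect a chat-style prompt; the tokenizer's
# chat template converts the message list into the model's prompt format.
messages = [{"role": "user", "content": "Summarize what a language model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate a reply and decode only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For throughput-sensitive serving, the same model ID can typically be pointed at a batching inference server instead of calling `generate` directly.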
