choco-conoz/TwinLlama-3.2-1B
Task: Text generation
- Model size: 1B
- Quantization: BF16
- Context length: 32k
- Published: Jun 28, 2025
- License: apache-2.0
- Architecture: Transformer

TwinLlama-3.2-1B is a 1-billion-parameter language model by choco-conoz, produced by supervised fine-tuning of unsloth/Llama-3.2-1B. It targets general language understanding and generation tasks, and its 32,768-token context length allows it to process longer inputs. It suits applications that need a compact yet capable model for text-based workloads.
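A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository id `choco-conoz/TwinLlama-3.2-1B` comes from this card; the prompt template in `build_prompt` and the generation parameters are illustrative assumptions, since the card does not specify a chat or instruction format.

```python
# Hypothetical usage sketch for TwinLlama-3.2-1B (assumptions flagged below).
MODEL_ID = "choco-conoz/TwinLlama-3.2-1B"  # repo id from this card
MAX_CONTEXT = 32768  # context length stated on the card


def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in a simple plain-text prompt.

    NOTE: this Alpaca-style template is an assumption, not the model's
    documented format.
    """
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    prompt = build_prompt("Summarize what a language model is.")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the base model is a plain (non-instruct) Llama checkpoint fine-tuned with SFT, you may need to adjust the prompt format to match whatever template was used during fine-tuning.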
