unsloth/tinyllama
Task: Text Generation
Concurrency Cost: 1
Model Size: 1.1B
Quant: BF16
Ctx Length: 2k
Published: Jan 1, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open
The unsloth/tinyllama model is a reupload of TinyLlama-1.1B-intermediate-step-1431k-3T, a 1.1 billion parameter causal language model. Unsloth optimizes it for faster, more memory-efficient finetuning, reporting 3.9x faster training and 74% lower memory usage than standard methods. It suits developers who want to finetune a compact language model quickly on resource-constrained hardware.