semarmehdi/TinyLlama-1.1B-LoRA-Finetuned-2
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Published: Apr 8, 2026 · Architecture: Transformer

semarmehdi/TinyLlama-1.1B-LoRA-Finetuned-2 is a 1.1-billion-parameter language model fine-tuned with LoRA from the TinyLlama base model. Shared by semarmehdi, it is intended for general language understanding and generation tasks. With a 2048-token context length, it offers a compact yet capable option for a range of NLP applications, and its small size makes it suitable for resource-constrained environments or workloads that require efficient inference.
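As a minimal sketch of how a checkpoint like this could be loaded for inference, assuming it is served through the standard Hugging Face `transformers` API (the prompt and generation settings below are illustrative, not taken from this page):

```python
# Illustrative inference sketch; assumes `transformers` and `torch` are installed
# and that the repository is accessible via the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "semarmehdi/TinyLlama-1.1B-LoRA-Finetuned-2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, matching the listed quantization
)

prompt = "Explain LoRA fine-tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in `torch.bfloat16` matches the BF16 quantization listed above and roughly halves memory use compared with FP32, which suits the model's resource-constrained target deployments.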
