Sheyko/TinyLlama-3.2-1B-LoRA-Finetuned-2
Text Generation · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Apr 8, 2026 · Architecture: Transformer

Sheyko/TinyLlama-3.2-1B-LoRA-Finetuned-2 is a 1-billion-parameter language model developed by Sheyko, fine-tuned from the TinyLlama base model using LoRA (low-rank adaptation). It is intended for general language tasks, and its compact size makes it efficient to deploy. A context length of 32,768 tokens gives it substantial capacity for longer inputs. The LoRA fine-tuning indicates adaptation toward a particular downstream use, though the target application is not specified.
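As a minimal usage sketch, the model can presumably be loaded with the Hugging Face `transformers` library. The repo id is taken from the listing above; its availability on the Hub, and the example prompt, are assumptions for illustration only:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id as shown in this listing; adjust if the model is hosted elsewhere.
model_id = "Sheyko/TinyLlama-3.2-1B-LoRA-Finetuned-2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
    device_map="auto",           # place weights on GPU if available
)

# Hypothetical prompt, purely for demonstration.
prompt = "Explain LoRA fine-tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 1B parameters in BF16, the weights occupy roughly 2 GB, so the model should fit comfortably on a single consumer GPU or run on CPU for light workloads.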
