annavivin/tinyllama-indic-sentiment-full
Text Generation | Concurrency Cost: 1 | Model Size: 1.1B | Quant: BF16 | Ctx Length: 2k | Published: Apr 17, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold
annavivin/tinyllama-indic-sentiment-full is a 1.1-billion-parameter language model developed by annavivin and fine-tuned from unsloth/tinyllama-chat-bnb-4bit. Optimized for sentiment analysis in Indic languages, it was trained with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster fine-tuning. The model targets efficient, specialized natural language processing tasks centered on sentiment understanding in Indic-language contexts.
TinyLlama Indic Sentiment Model
This model, developed by annavivin, is a 1.1-billion-parameter language model fine-tuned specifically for sentiment analysis in Indic languages. It uses the TinyLlama architecture and was built on the unsloth/tinyllama-chat-bnb-4bit base model.
Key Capabilities
- Specialized Sentiment Analysis: Optimized for understanding and classifying sentiment within text written in Indic languages.
- Efficient Training: Fine-tuned using Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training.
- Compact Size: With 1.1 billion parameters, it offers a balance between performance and computational efficiency, making it suitable for resource-constrained environments.
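The Unsloth + TRL training setup described above can be sketched as follows. This is a minimal, hypothetical reproduction of the fine-tuning recipe, not the author's actual script: the dataset fields (`text`, `label`), the prompt format, the LoRA rank, and all hyperparameters are assumptions.

```python
def to_training_example(text: str, label: str) -> dict:
    """Format a (text, sentiment label) pair into one training string.
    The 'Text: ... / Sentiment: ...' template is an assumption."""
    return {"text": f"Text: {text}\nSentiment: {label}"}


def finetune(dataset, base_model="unsloth/tinyllama-chat-bnb-4bit"):
    """Hypothetical SFT run with Unsloth + TRL; hyperparameters are illustrative."""
    # Heavy imports are deferred so to_training_example() works without
    # unsloth/trl/transformers installed.
    from unsloth import FastLanguageModel
    from trl import SFTTrainer
    from transformers import TrainingArguments

    # Load the 4-bit base model with Unsloth's patched loader.
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name=base_model,
        max_seq_length=2048,  # matches the card's 2k context length
        load_in_4bit=True,
    )
    # Attach LoRA adapters (rank is an assumed value).
    model = FastLanguageModel.get_peft_model(model, r=16)

    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=dataset.map(
            lambda ex: to_training_example(ex["text"], ex["label"])
        ),
        dataset_text_field="text",
        args=TrainingArguments(
            output_dir="outputs",
            per_device_train_batch_size=2,
            max_steps=60,
            learning_rate=2e-4,
        ),
    )
    trainer.train()
```

The key point is the single-string supervision format: each example collapses text and label into one sequence, which is what TRL's SFT trainer consumes.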
Good for
- Applications requiring sentiment analysis for Indic language content.
- Developers looking for a compact and efficient model for specialized NLP tasks.
- Projects where faster fine-tuning and deployment are critical.
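For the sentiment-analysis use cases listed above, inference can be sketched with the standard Hugging Face `transformers` API. This is an assumed usage pattern, not documented by the model author: the prompt template and the generation settings are illustrative, and the model may expect a different chat format inherited from its TinyLlama-chat base.

```python
def build_prompt(text: str) -> str:
    """Wrap input text in a simple classification prompt (template is an assumption)."""
    return (
        "Classify the sentiment of the following text as "
        "positive, negative, or neutral.\n"
        f"Text: {text}\n"
        "Sentiment:"
    )


def classify(text: str, model_id="annavivin/tinyllama-indic-sentiment-full") -> str:
    """Hypothetical one-shot sentiment call; downloads the model on first use."""
    # Import deferred so build_prompt() works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(build_prompt(text), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=5)
    # Decode only the newly generated tokens after the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()
```

Because the model is only 1.1B parameters in BF16, this fits comfortably on a single consumer GPU or even CPU, consistent with the resource-constrained deployments the card highlights.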