lzw1008/Emollama-chat-13b

Hugging Face

Text Generation · 13B parameters · FP8 quantization · 4k context length · Published: Jan 21, 2024 · License: MIT · Architecture: Transformer · Open Weights · Concurrency Cost: 1

Emollama-chat-13b is a 13 billion parameter instruction-following large language model developed by lzw1008, fine-tuned from Meta's LLaMA2-chat-13B. It is designed for comprehensive affective analysis, covering tasks such as sentiment polarity, categorical emotion classification, sentiment strength, and emotion intensity regression. The model is part of the EmoLLMs series, optimized for detailed emotional understanding and instruction-based affective analysis.


Emollama-chat-13b: Affective Analysis LLM

Emollama-chat-13b is a 13 billion parameter model from the EmoLLMs project, developed by lzw1008 and focused on comprehensive affective analysis. It is fine-tuned from the Meta LLaMA2-chat-13B foundation model using the full AAID (Affective Analysis Instruction Dataset) instruction-tuning data.

Key Capabilities

  • Affective Classification: Performs tasks such as sentiment polarity and categorical emotion identification.
  • Affective Regression: Handles regression tasks such as sentiment strength and emotion intensity scoring.
  • Instruction Following: Designed to follow instructions for various affective analysis tasks, as demonstrated by prompt examples for emotion intensity, sentiment strength, and sentiment/emotion classification.
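
The instruction-following tasks above use a Human/Assistant prompt shape. A minimal sketch of building such a prompt for emotion intensity regression is shown below; the exact task wording and field labels are assumptions modeled on the model card's examples, not a fixed API:

```python
# Hypothetical prompt builder for EmoLLMs-style emotion intensity
# regression. The template (Task/Text/Emotion fields between "Human:"
# and "Assistant:") is an assumption based on the card's prompt examples.

def build_intensity_prompt(text: str, emotion: str) -> str:
    """Format a prompt asking for an emotion intensity score in [0, 1]."""
    task = (
        f"Assign a numerical value between 0 (least {emotion}) and "
        f"1 (most {emotion}) to represent the intensity of emotion "
        f"{emotion} expressed in the text."
    )
    return (
        "Human:\n"
        f"Task: {task}\n"
        f"Text: {text}\n"
        f"Emotion: {emotion}\n"
        "Intensity Score:\n"
        "Assistant:\n"
    )

prompt = build_intensity_prompt("can't stop smiling", "joy")
```

The model is expected to complete the prompt with a numeric score after the `Assistant:` turn; classification tasks follow the same turn structure with a different task description.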

Ethical Considerations

The developers acknowledge potential biases, incorrect predictions, and over-generalization risks inherent in LLMs, advising caution when applying the model to real-world affective analysis systems.

Usage

This model can be easily integrated into Python projects using the Hugging Face Transformers library, with examples provided for loading the tokenizer and model.
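
A minimal loading sketch with the Transformers library follows; the repository id matches the model name above, and `device_map="auto"` assumes the `accelerate` package is installed:

```python
# Sketch: loading Emollama-chat-13b with Hugging Face Transformers.
# The first call downloads ~13B parameters of weights, so this is
# wrapped in a function and only executed under the main guard.

REPO_ID = "lzw1008/Emollama-chat-13b"

def load_emollama(repo_id: str = REPO_ID):
    """Return (tokenizer, model) for affective-analysis inference."""
    # Imported lazily so the module can be inspected without
    # transformers installed.
    from transformers import LlamaForCausalLM, LlamaTokenizer

    tokenizer = LlamaTokenizer.from_pretrained(repo_id)
    model = LlamaForCausalLM.from_pretrained(repo_id, device_map="auto")
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_emollama()
```

After loading, prompts can be tokenized and passed to `model.generate` as with any causal LLaMA-family model.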

Good For

  • Developers and researchers requiring an LLM specifically fine-tuned for detailed emotional understanding.
  • Applications involving sentiment analysis, emotion detection, and intensity scoring based on textual input.
  • Instruction-based affective analysis tasks where precise emotional classification or regression is needed.

Popular Sampler Settings

Featherless surfaces the three most popular sampler configurations used with this model. Each configuration specifies values for the following parameters:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p