AgnivaSaha/model_harmful_lora

Hugging Face · Text Generation
Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Mar 22, 2026 · Architecture: Transformer · Status: Warm

AgnivaSaha/model_harmful_lora is a 1.5 billion parameter language model with a 32768 token context length. It is a Hugging Face Transformers model whose model card was automatically generated when it was pushed to the Hub. Because the card contains limited information, specific details regarding its architecture, training, and intended use cases are not provided.


Model Overview

AgnivaSaha/model_harmful_lora is a 1.5 billion parameter language model, automatically generated and hosted on the Hugging Face Hub. Its 32768 token context length suggests it can process long input sequences.
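Before downloading the weights, the hosted configuration can be inspected as a quick sanity check. The following is a minimal sketch, assuming the repository exposes a standard config.json; the max_position_embeddings field is conventional for most causal-LM architectures but is not guaranteed to be present.

```python
# Minimal config-inspection sketch (assumption: the repo contains a
# standard Transformers config.json; if it holds only a PEFT adapter,
# this call may fail or return the adapter's base-model config).
from transformers import AutoConfig

config = AutoConfig.from_pretrained("AgnivaSaha/model_harmful_lora")
print(config.model_type)                                 # architecture family
print(getattr(config, "max_position_embeddings", None))  # expected: 32768
```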

Key Characteristics

  • Parameter Count: 1.5 billion.
  • Context Length: 32768 tokens, allowing long input sequences.
  • Model Type: a standard Hugging Face Transformers model (see the loading sketch below).
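Since the card identifies this as a standard Transformers model, the usual loading path should apply. The sketch below is hedged: it assumes the repo hosts a full causal-LM checkpoint loadable via AutoModelForCausalLM in BF16 (matching the listed quantization). Note that the "lora" in the repo name may instead indicate a PEFT adapter, which would need to be attached to its base model with the peft library rather than loaded directly.

```python
# Hedged loading sketch. Assumptions: the repo contains a full causal-LM
# checkpoint (not just a LoRA adapter) and device_map="auto" is available
# (requires the accelerate package).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AgnivaSaha/model_harmful_lora"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, matching the listed quantization
    device_map="auto",
)

prompt = "Hello, world."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```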

Limitations and Recommendations

The model card currently provides few specifics about the model's development, training data, intended applications, or performance benchmarks. Critical information such as the model's architecture, supported language(s), license, and fine-tuning origins is marked as "More Information Needed." As a result, direct and downstream use cases, along with potential biases, risks, and limitations, remain undefined. Users should exercise caution and seek further information before deploying this model in any application.