anirvankrishna/model_harmful_lora_fused

Text generation · Concurrency cost: 1 · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Mar 29, 2026 · Architecture: Transformer

The anirvankrishna/model_harmful_lora_fused is a 1.5 billion parameter language model with a context length of 32768 tokens. This model is presented as a Hugging Face Transformers model, though specific architectural details, training data, and evaluation results are not provided in its current model card. Its primary characteristics and intended use cases are currently undefined, as the model card indicates "More Information Needed" across most sections.


Model Overview

The anirvankrishna/model_harmful_lora_fused is a 1.5 billion parameter language model available on the Hugging Face Hub, designed for use with the transformers library. It supports a context length of 32768 tokens, allowing it to process long documents or extended conversations in a single pass.

Key Characteristics

  • Parameter Count: 1.5 billion parameters.
  • Context Length: Supports a context window of 32768 tokens.
  • Model Type: A Hugging Face Transformers model, indicating compatibility with the transformers ecosystem.
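Taking the listed specifications at face value (1.5B parameters, BF16 weights), a back-of-envelope estimate gives a sense of the memory required just to hold the weights; the 2-bytes-per-parameter figure is the standard BF16 size, and the function name below is illustrative:

```python
# Rough weight-memory estimate from the model card's listed specs.
# 1.5B parameters (from the card) x 2 bytes per BF16 parameter (standard).
PARAMS = 1_500_000_000
BYTES_PER_PARAM_BF16 = 2

def weight_memory_gib(params: int = PARAMS,
                      bytes_per_param: int = BYTES_PER_PARAM_BF16) -> float:
    """Return the weight footprint in GiB (weights only, no activations/KV cache)."""
    return params * bytes_per_param / 2**30

print(f"{weight_memory_gib():.2f} GiB")  # roughly 2.79 GiB for the weights alone
```

Note this excludes activation memory and the KV cache, which grows with sequence length and can dominate at the full 32k context.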

Current Status and Limitations

According to its model card, detailed information regarding its development, specific model type, language support, license, and finetuning origins is currently marked as "More Information Needed." Consequently, its intended direct or downstream uses, as well as potential biases, risks, and limitations, are not yet specified. Users should treat the model with caution until more comprehensive documentation is available.

Getting Started

While the model card provides no usage examples, the model should be loadable and runnable through the standard Hugging Face transformers APIs once more details become available.
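In the absence of documented usage, the generic transformers loading pattern is the natural starting point. The sketch below assumes the repository is public and contains a causal language model; the prompt, generation settings, and the `clamp_new_tokens` helper are all illustrative, not from the model card:

```python
# Hypothetical usage sketch for anirvankrishna/model_harmful_lora_fused,
# following the standard Hugging Face transformers pattern. Assumes the repo
# is public and exposes a causal LM; nothing here is confirmed by the card.
MODEL_ID = "anirvankrishna/model_harmful_lora_fused"
MAX_CONTEXT = 32768  # context length listed on the model card

def clamp_new_tokens(prompt_len: int, requested: int, ctx: int = MAX_CONTEXT) -> int:
    """Keep prompt tokens + generated tokens within the 32k context window."""
    return max(0, min(requested, ctx - prompt_len))

def main() -> None:
    # Heavy imports kept local so the helper above stays dependency-free.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed on the card
        device_map="auto",
    )

    prompt = "Hello, world."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=clamp_new_tokens(inputs["input_ids"].shape[1], 128),
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

Until the card documents intended uses and a chat template (if any), plain-text prompting as above is the safest assumption.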