grohitraj/Agri_ontologies_out

Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 2.6B · Quant: BF16 · Ctx Length: 8k · Published: Dec 9, 2025 · Architecture: Transformer · Warm

The grohitraj/Agri_ontologies_out model is a 2.6 billion parameter language model, fine-tuned from unsloth/gemma-2-2b-bnb-4bit. Developed by grohitraj, it was trained with the TRL framework using supervised fine-tuning (SFT), which points to a focus on a specific downstream task. The model supports an 8192-token context length, making it suitable for applications that require specialized language understanding over moderately long inputs.


Model Overview

The grohitraj/Agri_ontologies_out model is a 2.6 billion parameter language model, fine-tuned from the unsloth/gemma-2-2b-bnb-4bit base model. This model was developed by grohitraj and trained using the Transformer Reinforcement Learning (TRL) library, specifically employing Supervised Fine-Tuning (SFT) as its training procedure.

Key Characteristics

  • Base Model: Fine-tuned from unsloth/gemma-2-2b-bnb-4bit.
  • Training Framework: Trained with the TRL library using the SFT (Supervised Fine-Tuning) method.
  • Parameter Count: Features 2.6 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports an 8192-token context window, enabling processing of moderately long inputs.

Usage and Development

This model is designed for text generation tasks and can be loaded through the standard transformers pipeline, so developers can integrate it into their applications for various language-related tasks. The training environment used PEFT 0.18.0, TRL 0.24.0, Transformers 4.57.1, and PyTorch 2.6.0; matching these versions helps ensure compatibility when reproducing or extending the fine-tune.
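A quick-start along these lines would use the transformers text-generation pipeline. This is a generic sketch (the prompt is illustrative and assumes the checkpoint is accessible on the Hub), not an excerpt from the author's own example:

```python
# Quick-start sketch: text generation with the transformers pipeline.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="grohitraj/Agri_ontologies_out",  # checkpoint named in this card
)

# Illustrative prompt -- the model's intended prompt format is not documented here.
prompt = "Define the term 'crop rotation' as used in an agricultural ontology:"
result = generator(prompt, max_new_tokens=64)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts, one per generated sequence, each with a `generated_text` key containing the prompt plus the continuation.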