TAUR-dev/rankalign-v6-gemma-2-2b-d0.15-e2-hc-b2d-dbl-all-nv1-ng1-vlo-fsx-lo0.1

Hugging Face · Text Generation · Model Size: 2.6B · Quant: BF16 · Context Length: 8k · Published: Apr 6, 2026 · Architecture: Transformer

TAUR-dev/rankalign-v6-gemma-2-2b-d0.15-e2-hc-b2d-dbl-all-nv1-ng1-vlo-fsx-lo0.1 is a fine-tuned Gemma-2-2B model from the rankalign project, with approximately 2.6 billion parameters and a context length of 8192 tokens. It is trained for hypernym tasks, that is, identifying hierarchical 'is-a' relationships between concepts. Training combined a preference loss with validator and generator NLL terms (all weighted at 1; see Key Training Details below), making the model suitable for research and applications that require precise semantic-hierarchy understanding.


Model Overview

This model, rankalign-v6-gemma-2-2b-d0.15-e2-hc-b2d-dbl-all-nv1-ng1-vlo-fsx-lo0.1, is a specialized fine-tuned checkpoint derived from the google/gemma-2-2b base model. It is part of the rankalign project, which focuses on aligning language models for specific linguistic tasks.
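
As a quick start, the checkpoint can presumably be loaded with the standard transformers API. This is a minimal sketch, assuming the repository follows the usual Gemma-2 checkpoint layout; the BF16 dtype matches the quantization listed above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Minimal loading sketch, assuming a standard Hugging Face checkpoint layout.
model_id = "TAUR-dev/rankalign-v6-gemma-2-2b-d0.15-e2-hc-b2d-dbl-all-nv1-ng1-vlo-fsx-lo0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "A banana is a kind of"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```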

Key Training Details

  • Base Model: google/gemma-2-2b
  • Version: v6
  • Task: hypernym-concat-bananas-to-dogs-double-all, indicating a focus on hypernym identification across various categories.
  • Epochs: Trained for 2 epochs.
  • Delta: 0.15 (the d0.15 suffix); a fine-tuning hyperparameter whose exact role is not documented here.
  • Preference Loss Weight: 1
  • NLL Validator Weight: 1
  • NLL Generator Weight: 1 (a sketch of how these three weights might combine follows this list)
  • Validator Log-Odds: Enabled (the vlo suffix), suggesting the validator scores candidates in log-odds space rather than raw probabilities.
  • Labeled-only Ratio: 0.1 (the lo0.1 suffix), suggesting roughly 10% of the training data was labeled-only.
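
The three loss weights above are all set to 1. The actual rankalign objective is not documented in this card, but as a hypothetical sketch, weights like these would typically enter the training loss as a weighted sum of the three terms:

```python
# Illustrative sketch only: how the three weights listed above might combine
# into one objective. This is NOT the rankalign implementation; all names
# below are hypothetical.
PREF_LOSS_WEIGHT = 1.0       # "Preference Loss Weight: 1"
NLL_VALIDATOR_WEIGHT = 1.0   # "NLL Validator Weight: 1" (the nv1 suffix)
NLL_GENERATOR_WEIGHT = 1.0   # "NLL Generator Weight: 1" (the ng1 suffix)

def combined_loss(preference_loss: float,
                  validator_nll: float,
                  generator_nll: float) -> float:
    """Weighted sum of the three loss terms named in the training details."""
    return (PREF_LOSS_WEIGHT * preference_loss
            + NLL_VALIDATOR_WEIGHT * validator_nll
            + NLL_GENERATOR_WEIGHT * generator_nll)
```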

Use Cases

This model is particularly well-suited for research and development in:

  • Hypernym Detection: Identifying 'is-a' relationships between words or concepts (a scoring sketch follows this list).
  • Semantic Hierarchy Understanding: Applications requiring a nuanced grasp of semantic categorization.
  • Linguistic Analysis: Exploring the effects of specific fine-tuning parameters on model performance in semantic tasks.
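
For hypernym detection specifically, one common pattern is to compare the log-likelihoods the model assigns to candidate hypernym completions. The sketch below is illustrative only, not an evaluation protocol documented for this model; the prompt wording and candidate set are made up for the example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical scoring sketch; not an officially documented protocol.
model_id = "TAUR-dev/rankalign-v6-gemma-2-2b-d0.15-e2-hc-b2d-dbl-all-nv1-ng1-vlo-fsx-lo0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model.eval()

def completion_logprob(prompt: str, completion: str) -> float:
    """Total log-probability the model assigns to `completion` after `prompt`.

    Assumes tokenizing `prompt` and `prompt + completion` yields a shared
    prompt prefix (true for typical word-boundary completions).
    """
    prompt_len = tokenizer(prompt, return_tensors="pt").input_ids.shape[1]
    full_ids = tokenizer(prompt + completion, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits
    log_probs = torch.log_softmax(logits.float(), dim=-1)
    # The token at position `pos` is predicted by the logits at `pos - 1`.
    score = 0.0
    for pos in range(prompt_len, full_ids.shape[1]):
        score += log_probs[0, pos - 1, full_ids[0, pos]].item()
    return score

# Rank candidate hypernyms for "beagle"; a higher score means a better fit.
for candidate in [" dog", " fruit", " vehicle"]:
    print(candidate.strip(), completion_logprob("A beagle is a kind of", candidate))
```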