TAUR-dev/rankalign-v6-gemma-2-2b-it-d0.15-e1-hc-b2d-dbl-all-fsx-sm0.1
Text Generation · Model Size: 2.6B · Quant: BF16 · Context Length: 8k · Published: Apr 9, 2026 · Architecture: Transformer

TAUR-dev/rankalign-v6-gemma-2-2b-it-d0.15-e1-hc-b2d-dbl-all-fsx-sm0.1 is a 2.6-billion-parameter model based on Gemma-2-2b-it, fine-tuned as part of the rankalign project. It is optimized for hypernym-concat-bananas-to-dogs-double-all tasks, which target semantic relationship understanding, and is intended for specialized applications that require precise identification and generation of hypernyms within specific semantic domains. The model was trained with a delta of 0.15 for one epoch.
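A minimal usage sketch with the Hugging Face `transformers` library. The prompt format below is hypothetical — the card does not document the exact input format used during fine-tuning — and the generation settings are illustrative defaults, not recommendations from the model authors:

```python
MODEL_ID = "TAUR-dev/rankalign-v6-gemma-2-2b-it-d0.15-e1-hc-b2d-dbl-all-fsx-sm0.1"


def build_prompt(term: str) -> str:
    # Hypothetical prompt shape for hypernym generation; the actual
    # training format is not documented in this model card.
    return f"What is a hypernym of '{term}'?"


def generate_hypernym(term: str, max_new_tokens: int = 16) -> str:
    # Imported lazily so the sketch can be read (and the prompt helper
    # used) without `transformers` installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed above; adjust for your hardware.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    inputs = tokenizer(build_prompt(term), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_hypernym("banana"))
```

Because the model is an instruction-tuned Gemma-2 variant, applying the tokenizer's chat template instead of a raw prompt may yield better results.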
