TAUR-dev/rankalign-v6-gemma-2-2b-it-d0.15-e2-hc-b2d-dbl-all-p0-nv1-ng1-fsx-sm0.1
Text generation · Concurrency cost: 1 · Model size: 2.6B · Quant: BF16 · Context length: 8k · Published: Apr 9, 2026 · Architecture: Transformer
TAUR-dev/rankalign-v6-gemma-2-2b-it-d0.15-e2-hc-b2d-dbl-all-p0-nv1-ng1-fsx-sm0.1 is a 2.6-billion-parameter instruction-tuned model fine-tuned from Google's Gemma-2-2b-it base model. Developed as part of the rankalign project, it is optimized for hypernym generation, i.e., identifying broader categories for given concepts (for example, "bird" as a hypernym of "sparrow"). As encoded in the model name, it was trained with a delta of 0.15 and an NLL validator weight of 1, making it suitable for research into semantic hierarchies and linguistic relationships.
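Since the model is fine-tuned from Gemma-2-2b-it, it can be queried through the standard Gemma-2 chat format via Hugging Face transformers. The sketch below is illustrative: the exact task phrasing and generation settings are assumptions, as the page does not specify the prompt used during fine-tuning.

```python
def hypernym_prompt(concept: str) -> str:
    """Wrap a hypernym query in the Gemma-2 chat turn format.

    The task wording ("What is a broader category for ...?") is an
    illustrative assumption, not taken from the model card.
    """
    question = f"What is a broader category for '{concept}'?"
    return f"<start_of_turn>user\n{question}<end_of_turn>\n<start_of_turn>model\n"


def generate_hypernym(concept: str, max_new_tokens: int = 16) -> str:
    """Run the model with transformers (requires `transformers`, `torch`,
    and downloading the ~2.6B-parameter checkpoint in BF16)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    name = "TAUR-dev/rankalign-v6-gemma-2-2b-it-d0.15-e2-hc-b2d-dbl-all-p0-nv1-ng1-fsx-sm0.1"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name, torch_dtype="bfloat16")
    inputs = tokenizer(hypernym_prompt(concept), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(out[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)


if __name__ == "__main__":
    # Show the prompt format without downloading the model.
    print(hypernym_prompt("sparrow"))
```

The 8k context length is far more than a single hypernym query needs, so batching many concepts per call is also an option.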