rbelanec/train_rte_42_1774791065
Text generation · Concurrency cost: 1 · Model size: 1B · Quantization: BF16 · Context length: 32k · Published: Mar 29, 2026 · License: llama3.2 · Architecture: Transformer

The rbelanec/train_rte_42_1774791065 model is a 1-billion-parameter language model fine-tuned from meta-llama/Llama-3.2-1B-Instruct. It was trained on the RTE (Recognizing Textual Entailment) dataset, optimizing it for natural language inference. With a context length of 32768 tokens, the model is designed to determine the relationship between a pair of texts; RTE frames this as a binary decision between entailment and not-entailment.
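Since the base model is instruction-tuned, an entailment query would typically be posed as a prompt over a premise/hypothesis pair. The sketch below shows one way to format such a query; the prompt template and label wording are assumptions, as the model card does not document the exact training format.

```python
# Sketch: formatting an RTE-style (textual entailment) query for an
# instruction-tuned model such as rbelanec/train_rte_42_1774791065.
# The template below is an assumption, not the model's documented format.

def build_rte_prompt(premise: str, hypothesis: str) -> str:
    """Build a binary entailment prompt from a premise/hypothesis pair."""
    return (
        "Determine whether the hypothesis is entailed by the premise. "
        "Answer with 'entailment' or 'not_entailment'.\n"
        f"Premise: {premise}\n"
        f"Hypothesis: {hypothesis}\n"
        "Answer:"
    )

prompt = build_rte_prompt(
    "A dog is running in the park.",
    "An animal is outdoors.",
)
print(prompt)
```

The resulting string could then be passed to the model loaded through the standard Hugging Face `transformers` entry points (`AutoTokenizer.from_pretrained` and `AutoModelForCausalLM.from_pretrained`), generating a short completion that is read off as the entailment label.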
