rbelanec/train_mnli_42_1775732963
Text Generation · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Apr 9, 2026 · License: llama3.2 · Architecture: Transformer
The rbelanec/train_mnli_42_1775732963 model is a 1-billion-parameter language model fine-tuned from meta-llama/Llama-3.2-1B-Instruct on the MNLI dataset for natural language inference: given a sentence pair, it decides whether the hypothesis is an entailment of, a contradiction of, or neutral with respect to the premise. The model reaches a validation loss of 0.1219 on the evaluation set.
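A minimal usage sketch for such an NLI fine-tune is shown below. The prompt template and label-parsing logic are assumptions (the card does not specify the exact format the fine-tune was trained with); actual model inference via the `transformers` library is indicated in comments rather than executed.

```python
# Hedged sketch: prompt construction and label parsing for an MNLI-style
# NLI model. The instruction template below is a plausible assumption,
# not the documented training format of this fine-tune.

LABELS = ("entailment", "neutral", "contradiction")

def build_nli_prompt(premise: str, hypothesis: str) -> str:
    """Format a premise/hypothesis pair as an instruction (assumed template)."""
    return (
        "Determine the relationship between the two sentences.\n"
        f"Premise: {premise}\n"
        f"Hypothesis: {hypothesis}\n"
        "Answer with one of: entailment, neutral, contradiction.\n"
        "Answer:"
    )

def parse_label(generated: str) -> str:
    """Map the model's free-form output to one of the three MNLI labels."""
    text = generated.strip().lower()
    for label in LABELS:
        if label in text:
            return label
    return "neutral"  # fallback when no label is recognized

# Actual inference (not run here) would look roughly like:
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("rbelanec/train_mnli_42_1775732963")
#   model = AutoModelForCausalLM.from_pretrained(
#       "rbelanec/train_mnli_42_1775732963", torch_dtype="bfloat16")
#   inputs = tok(build_nli_prompt(premise, hypothesis), return_tensors="pt")
#   out = model.generate(**inputs, max_new_tokens=8)
#   label = parse_label(tok.decode(out[0][inputs["input_ids"].shape[1]:]))

prompt = build_nli_prompt(
    "A soccer game with multiple males playing.",
    "Some men are playing a sport.",
)
print(parse_label(" Entailment."))  # → entailment
```

The helper keeps the three MNLI classes explicit so the free-form generation can be mapped deterministically back to a label.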