rbelanec/train_mnli_42_1773765555
Task: Text Generation
Model Size: 1B
Quant: BF16
Context Length: 32k
Published: Mar 17, 2026
License: llama3.2
Architecture: Transformer

rbelanec/train_mnli_42_1773765555 is a 1-billion-parameter language model fine-tuned from meta-llama/Llama-3.2-1B-Instruct for Natural Language Inference (NLI). Trained on the MNLI dataset, it classifies the logical relationship between a premise and a hypothesis as entailment, neutral, or contradiction. It reaches a validation loss of 0.2161 on that task, suggesting solid performance at classifying textual entailment relationships.
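As a rough illustration of how such an NLI model might be queried, the sketch below builds an instruction-style premise/hypothesis prompt and defines the standard MNLI label set. Note that the exact prompt template used during fine-tuning is not documented in this card, so the format and the `build_nli_prompt` helper here are assumptions, not the model's verified input format.

```python
# Hypothetical sketch: the prompt wording and label ordering below are
# assumptions; the model card does not document the training format.

# The three MNLI classes the model is described as distinguishing.
MNLI_LABELS = ("entailment", "neutral", "contradiction")

def build_nli_prompt(premise: str, hypothesis: str) -> str:
    """Format a premise/hypothesis pair as an instruction-style NLI query."""
    return (
        "Classify the relationship between the two sentences as "
        "entailment, neutral, or contradiction.\n"
        f"Premise: {premise}\n"
        f"Hypothesis: {hypothesis}\n"
        "Label:"
    )

prompt = build_nli_prompt(
    "A man is playing a guitar on stage.",
    "Someone is performing music.",
)
print(prompt)
```

The resulting string could then be passed to the model through a standard text-generation interface (for example, the Hugging Face `transformers` text-generation pipeline with this model ID), with the generated continuation mapped back onto one of the three labels.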
