rbelanec/train_mrpc_42_1774791061
Text Generation · Model Size: 1B · Quantization: BF16 · Context Length: 32k · Published: Mar 29, 2026 · License: llama3.2 · Architecture: Transformer · Concurrency Cost: 1

The rbelanec/train_mrpc_42_1774791061 model is a 1-billion-parameter language model fine-tuned by rbelanec. It is based on meta-llama/Llama-3.2-1B-Instruct and optimized for the MRPC (Microsoft Research Paraphrase Corpus) dataset. The model is intended for paraphrase detection, i.e. judging whether two sentences express the same meaning, and achieves a validation loss of 0.1740 on the evaluation set.
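A minimal sketch of querying the model for paraphrase detection via the Hugging Face `transformers` library. The prompt template below is an assumption for illustration; the exact instruction format used during fine-tuning is not documented on this card, so results may differ with a different template.

```python
def build_mrpc_prompt(sentence1: str, sentence2: str) -> str:
    """Format an MRPC-style paraphrase query (assumed template, not
    the verified fine-tuning format)."""
    return (
        "Determine whether the following two sentences are paraphrases "
        "of each other.\n"
        f"Sentence 1: {sentence1}\n"
        f"Sentence 2: {sentence2}\n"
        "Answer (yes/no):"
    )


if __name__ == "__main__":
    # Requires `pip install transformers torch` and network access
    # to download the checkpoint.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="rbelanec/train_mrpc_42_1774791061",
        torch_dtype="bfloat16",  # matches the BF16 precision listed above
    )
    prompt = build_mrpc_prompt(
        "The company said profits rose 10% last quarter.",
        "Profits at the firm increased by ten percent in the last quarter.",
    )
    out = generator(prompt, max_new_tokens=5, do_sample=False)
    print(out[0]["generated_text"])
```

Greedy decoding (`do_sample=False`) with a small `max_new_tokens` keeps the output limited to the short yes/no-style answer the prompt asks for.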
