FlyPig23/Llama3.2-3B_Paper_Impact_citation_SFT_1ep
Text generation · Concurrency cost: 1 · Model size: 3.2B · Quantization: BF16 · Context length: 32k · Published: Apr 7, 2026 · License: other · Architecture: Transformer
FlyPig23/Llama3.2-3B_Paper_Impact_citation_SFT_1ep is a 3.2-billion-parameter, Llama 3.2-based instruction-tuned model, fine-tuned from meta-llama/Llama-3.2-3B-Instruct. It was trained for 1 epoch on the paper_impact_citations_train dataset and reached a loss of 0.0836 on the evaluation set, specializing it for paper-impact and citation-analysis tasks.
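A minimal way to query the model with the Hugging Face `transformers` library might look like the sketch below. The repo id comes from the card; the prompt wording and the helper names (`build_prompt`, `assess_paper`) are illustrative assumptions, since the card does not document an input template.

```python
# Usage sketch, assuming the `transformers` library is installed and the
# model weights are accessible. Prompt format and helper names are guesses,
# not taken from the model card.

MODEL_ID = "FlyPig23/Llama3.2-3B_Paper_Impact_citation_SFT_1ep"


def build_prompt(title: str, abstract: str) -> str:
    """Format a paper-impact query; the exact instruction text is an assumption."""
    return (
        "Assess the likely citation impact of the following paper.\n"
        f"Title: {title}\n"
        f"Abstract: {abstract}\n"
        "Assessment:"
    )


def assess_paper(title: str, abstract: str, max_new_tokens: int = 128) -> str:
    """Run one generation pass; downloads the model weights on first call."""
    # Imported lazily so the prompt helper works without transformers installed.
    from transformers import pipeline

    pipe = pipeline("text-generation", model=MODEL_ID, torch_dtype="bfloat16")
    out = pipe(build_prompt(title, abstract), max_new_tokens=max_new_tokens)
    return out[0]["generated_text"]
```

Since the base model is instruction-tuned, wrapping the prompt with the Llama 3.2 chat template (`tokenizer.apply_chat_template`) may yield better results than the raw string prompt shown here.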