FlyPig23/Llama3.2-3B_Paper_Impact_SFT
Text generation · Model size: 3.2B · Quant: BF16 · Context length: 32K · Published: Apr 7, 2026 · License: other · Architecture: Transformer

FlyPig23/Llama3.2-3B_Paper_Impact_SFT is a 3.2-billion-parameter language model fine-tuned from Meta's Llama-3.2-3B-Instruct. It was trained on the paper_impact_sft_train dataset, indicating it is optimized for tasks related to analyzing or generating content about the impact of research papers. Its 32K-token context window makes it suitable for processing long documents, such as full papers, within its specialized domain.
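A minimal usage sketch, assuming the model follows the standard Hugging Face transformers chat interface inherited from Llama-3.2-3B-Instruct. The system/user prompt wording and the `build_messages`/`assess_impact` helper names are illustrative assumptions, not part of the model's documentation:

```python
MODEL_ID = "FlyPig23/Llama3.2-3B_Paper_Impact_SFT"


def build_messages(abstract: str) -> list[dict]:
    """Wrap a paper abstract in a chat-style prompt (assumed format)."""
    return [
        {"role": "system",
         "content": "You assess the likely impact of research papers."},
        {"role": "user",
         "content": f"Assess the impact of this paper:\n\n{abstract}"},
    ]


def assess_impact(abstract: str, max_new_tokens: int = 256) -> str:
    """Load the model in BF16 and generate an impact assessment.

    Imported lazily because the checkpoint download (~6-7 GB in BF16)
    and GPU placement are only needed at generation time.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )
    # apply_chat_template renders the Llama 3.2 chat format and tokenizes.
    inputs = tokenizer.apply_chat_template(
        build_messages(abstract),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the generated continuation.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)
```

With the 32K context window, a full abstract (or substantially more of a paper) fits comfortably in a single prompt.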
