FlyPig23/Llama3.2-3B_Paper_Impact_media_SFT_1ep
Text generation · Model size: 3.2B · Quant: BF16 · Ctx length: 32k · Concurrency cost: 1 · Published: Apr 7, 2026 · License: other · Architecture: Transformer
FlyPig23/Llama3.2-3B_Paper_Impact_media_SFT_1ep is a 3.2-billion-parameter language model fine-tuned from Meta's Llama-3.2-3B-Instruct. It was adapted on the paper_impact_media_train dataset for tasks related to the impact of academic papers and media, and is intended for specialized applications that require understanding and generation within that domain. The model retains the Llama 3.2 architecture and supports a 32,768-token context length.
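Since this is a standard Llama 3.2 fine-tune, it can presumably be loaded with the Hugging Face transformers library like any other causal LM. The sketch below shows one way to do that; the generation parameters are illustrative, and the card does not specify a required prompt format, so the default chat template of the base model is assumed.

```python
# Hypothetical usage sketch for this model with Hugging Face transformers.
# Assumes the repo follows the standard Llama 3.2 chat template; adjust
# dtype/device settings to your hardware.

MODEL_ID = "FlyPig23/Llama3.2-3B_Paper_Impact_media_SFT_1ep"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Imports are local so the sketch only needs transformers/torch at call time.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # card lists BF16 weights
        device_map="auto",
    )

    # Build a chat-formatted prompt with the tokenizer's template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize the media impact of a highly cited paper."))
```

Note that the 32k context length applies to the combined prompt and generated tokens, so long inputs leave correspondingly less room for output.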