FlyPig23/Llama3.2-3B_Paper_Impact_code_SFT_1ep
Text generation · Concurrency cost: 1 · Model size: 3.2B · Quant: BF16 · Ctx length: 32k · Published: Apr 7, 2026 · License: other · Architecture: Transformer

FlyPig23/Llama3.2-3B_Paper_Impact_code_SFT_1ep is a 3.2-billion-parameter Llama 3.2-Instruct model fine-tuned by FlyPig23. It was trained on the paper_impact_code_train dataset for one epoch, reaching a loss of 0.0870 on its evaluation set. The model targets code generation and analysis tasks, particularly in the context of assessing academic paper impact.
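A minimal usage sketch with the Hugging Face `transformers` library is below. The prompt and generation parameters are illustrative assumptions; the model card does not specify a prompt format, and `generate` is only a hypothetical helper name.

```python
MODEL_ID = "FlyPig23/Llama3.2-3B_Paper_Impact_code_SFT_1ep"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion from the fine-tuned checkpoint.

    The transformers import and model load happen inside the function,
    since from_pretrained downloads the ~3B-parameter weights on first use.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the model page.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Illustrative prompt only; adapt to your task.
    print(generate("Summarize the likely impact of the following paper abstract:"))
```

Calling `generate` requires network access and enough memory for the BF16 weights; for repeated calls, hoist the tokenizer and model load out of the function.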
