FlyPig23/Llama3.2-3B_Paper_Impact_patent_SFT_1ep
Text generation · Concurrency cost: 1 · Model size: 3.2B · Quantization: BF16 · Context length: 32k · Published: Apr 7, 2026 · License: other · Architecture: Transformer

FlyPig23/Llama3.2-3B_Paper_Impact_patent_SFT_1ep is a 3.2-billion-parameter instruction-tuned model based on Llama 3.2. It was fine-tuned on the paper_impact_patents_train dataset and specializes in patent analysis and assessing the impact of scientific papers. The model is aimed at understanding and generating text in the patent and research-literature domain, and supports a context length of 32,768 tokens.
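A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is hosted on the Hub under the repo id above. The card does not document the expected prompt format, so the instruction text below is a hypothetical example, not the training template:

```python
MODEL_ID = "FlyPig23/Llama3.2-3B_Paper_Impact_patent_SFT_1ep"


def build_messages(abstract: str) -> list[dict]:
    # Hypothetical instruction wording; adapt to the actual SFT prompt format.
    return [
        {
            "role": "user",
            "content": "Assess the likely patent impact of this paper abstract:\n"
            + abstract,
        }
    ]


def generate(abstract: str, max_new_tokens: int = 256) -> str:
    # Lazy import so the prompt helper above works without transformers installed.
    # First call downloads roughly 6.4 GB of BF16 weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(abstract), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Because the model ships in BF16 with a 32k context, a GPU with at least ~8 GB of memory is a reasonable assumption for inference; `device_map="auto"` lets `transformers` place the weights accordingly.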
