chaoyi-wu/PMC_LLAMA_7B_10_epoch
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: May 17, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

PMC_LLAMA_7B_10_epoch is a 7 billion parameter LLaMA-based model developed by chaoyi-wu, fine-tuned on biomedical literature from the PMC papers within the S2ORC dataset. This version was trained for 10 epochs, more than its predecessor, to improve performance on scientific and medical text. It is intended for applications that require understanding and generating content in the biomedical domain.
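The listing does not include a usage snippet. Below is a minimal sketch of loading the checkpoint with Hugging Face `transformers`, assuming the weights are hosted on the Hub under the repository ID above and are compatible with the standard LLaMA classes; the prompt and generation settings are illustrative, not taken from the model card.

```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM

# Assumed Hub repository ID, taken from the listing title.
model_id = "chaoyi-wu/PMC_LLAMA_7B_10_epoch"

# Load tokenizer and weights; fp16 keeps the 7B model within a single-GPU memory budget.
tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# Illustrative biomedical prompt for greedy continuation.
prompt = "The mechanism of action of metformin involves"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```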
