chaoyi-wu/PMC_LLAMA_7B
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Apr 12, 2023
License: apache-2.0
Architecture: Transformer
Concurrency cost: 1
Weights: open

PMC_LLAMA_7B is a 7-billion-parameter LLaMA-based causal language model developed by chaoyi-wu. It was fine-tuned on PubMed Central (PMC) papers from the S2ORC dataset, specializing in biomedical and scientific text generation. The model is suited to tasks that require understanding and generating scientific literature, particularly medical and biological research text.
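As a LLaMA-based causal language model, it can be loaded with the standard Hugging Face `transformers` classes. The sketch below is an assumption, not an official recipe from this card: it uses the repo id `chaoyi-wu/PMC_LLAMA_7B` from the card header and greedy decoding for a biomedical completion.

```python
# Minimal sketch (assumption): loading PMC_LLAMA_7B via Hugging Face
# `transformers`. Requires `torch` and `transformers`; downloads the
# weights on first use. The repo id is taken from this model card.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

MODEL_ID = "chaoyi-wu/PMC_LLAMA_7B"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Greedy-decode a continuation of `prompt` with the model."""
    tokenizer = LlamaTokenizer.from_pretrained(MODEL_ID)
    model = LlamaForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=False
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Hypothetical biomedical prompt for illustration only.
    print(generate("Aspirin irreversibly inhibits cyclooxygenase by"))
```

Note that the header lists a 4k context length, so prompts longer than 4096 tokens must be truncated before generation.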
