abhinand/BioMedGPT-LM-7B-sharded-bf16
Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · License: apache-2.0 · Architecture: Transformer · Concurrency Cost: 1 · Open Weights

BioMedGPT-LM-7B is a 7-billion-parameter generative language model developed by PharMolix, fine-tuned from Llama2-7B-Chat specifically for the biomedical domain. It was trained on over 26 billion tokens drawn from millions of biomedical papers in the S2ORC corpus. On biomedical question-answering tasks it matches or exceeds human-level performance and outperforms larger general-purpose models in this specialized field.
