bluesky333/medphi2

Text generation | Model size: 3B | Quant: BF16 | Context length: 2k | Published: Jun 9, 2024 | License: MIT | Architecture: Transformer | Open weights

MedPhi-2 is a 2.7-billion-parameter model based on Microsoft's Phi-2, further trained by bluesky333 for the biomedical domain. This clinical large language model (LLM) is designed for medical question answering, as proposed in the MedExQA paper, and leverages its specialized fine-tuning to process and generate English-language responses in medical contexts.


MedPhi-2: A Specialized Clinical LLM

MedPhi-2 is a 2.7 billion parameter language model, fine-tuned from the Microsoft Phi-2 architecture. Developed by bluesky333, this model is specifically adapted for the biomedical domain.

Key Capabilities

  • Medical Question Answering: Optimized for understanding and responding to medical queries.
  • Biomedical Domain Expertise: Enhanced knowledge and reasoning within clinical contexts.
  • English Language Support: Processes and generates text in English.
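A minimal usage sketch follows, assuming the model is hosted on the Hugging Face Hub under the repository id shown in this card's header and is loadable with the `transformers` library. The prompt template is an assumption for illustration; check the model card for the exact format used during fine-tuning.

```python
# Minimal usage sketch for MedPhi-2 via the Hugging Face transformers API.
# ASSUMPTIONS: the repo id below comes from this card's header, and the
# "Question:/Answer:" prompt template is illustrative, not the authors' format.

MODEL_ID = "bluesky333/medphi2"


def build_prompt(question: str) -> str:
    """Wrap a medical question in a plain instruction-style prompt (assumed format)."""
    return f"Question: {question}\nAnswer:"


def generate_answer(question: str, max_new_tokens: int = 128) -> str:
    """Load MedPhi-2 and generate an answer (downloads weights on first call)."""
    # Imported lazily so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

With a 2k context window and BF16 weights, the model fits comfortably on a single consumer GPU; greedy decoding (`do_sample=False`) is used here for reproducible outputs.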

Origin and Purpose

MedPhi-2 was proposed and utilized in the MedExQA paper, which introduces a Medical Question Answering Benchmark with Multiple Explanations. The model's development is closely tied to this research, aiming to provide a robust tool for medical information retrieval and explanation generation. The associated MedExQA dataset is also available.
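If the MedExQA dataset is published on the Hugging Face Hub, it could be loaded with the `datasets` library along the following lines. The repository id here is an assumption (author namespace plus paper name) and should be verified against the Hub before use.

```python
def load_medexqa(split: str = "test"):
    """Load the MedExQA benchmark from the Hugging Face Hub.
    ASSUMPTION: the repo id is a guess from the author's namespace and the
    paper name; confirm the exact id (and available splits) on the Hub."""
    # Imported lazily: `datasets` is an optional dependency for this sketch.
    from datasets import load_dataset

    return load_dataset("bluesky333/MedExQA", split=split)
```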

Good For

  • Applications requiring specialized medical knowledge.
  • Research in clinical natural language processing.
  • Developing systems for medical question answering and explanation generation.