BLACKBUN/llama-2-7b-pubmed-qa-211k
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Architecture: Transformer · Published: Oct 14, 2023

BLACKBUN/llama-2-7b-pubmed-qa-211k is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf on the qiaojin/PubMedQA dataset. It is optimized for question answering in the biomedical domain: given a question about medical literature, it aims to produce accurate, contextually relevant answers, making it suitable for applications that require specialized knowledge in healthcare and research.
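Since the base model is Llama-2-7b-chat-hf, prompts are typically wrapped in Llama-2's `[INST]` chat template. The sketch below builds a PubMedQA-style prompt (abstract plus question) in that format; note that the exact template and system prompt used during fine-tuning are not documented here, so both are assumptions.

```python
def build_pubmedqa_prompt(question: str, context: str) -> str:
    """Format a biomedical QA prompt in Llama-2 chat style.

    The [INST]/<<SYS>> wrapping follows Llama-2-chat's standard template;
    whether this fine-tune expects the same format is an assumption.
    """
    system = (
        "You are a biomedical question-answering assistant. "
        "Answer using only the provided abstract."
    )
    return (
        f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n"
        f"Abstract: {context}\n\nQuestion: {question} [/INST]"
    )


# Example: a hypothetical question/abstract pair for illustration only.
prompt = build_pubmedqa_prompt(
    "Does low-dose aspirin reduce cardiovascular risk?",
    "A randomized trial of low-dose aspirin in adults ...",
)
```

The resulting string would then be passed to the model through your inference client of choice (for example, a `transformers` text-generation pipeline or the hosting provider's API).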
