AdaptLLM/medicine-chat
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Dec 9, 2023 · License: llama2 · Architecture: Transformer

AdaptLLM/medicine-chat is a 7-billion-parameter model based on LLaMA-2-Chat-7B, fine-tuned by AdaptLLM for the biomedicine domain. It is trained with a method that transforms large-scale pre-training corpora into reading-comprehension texts, enriching domain knowledge while preserving the base model's prompting ability. The model targets question answering and conversational tasks in the medical field, and demonstrates performance competitive with much larger domain-specific models.
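As a Llama-2-based chat model with open weights, it can be run locally with the Hugging Face `transformers` library. The sketch below is a minimal, hedged example: it assumes the standard `AutoTokenizer`/`AutoModelForCausalLM` loading path and a Llama-2-chat-style `[INST] … [/INST]` prompt wrapper; check the model card for the exact prompt template the authors recommend.

```python
def build_prompt(question: str) -> str:
    # Llama-2-chat style instruction wrapper; this template is an
    # assumption -- consult the model card for the authors' exact format.
    return f"[INST] {question} [/INST]"


def generate_answer(question: str, model_name: str = "AdaptLLM/medicine-chat") -> str:
    # Lazy imports so the prompt helper above is usable without
    # transformers/torch installed or the 7B weights downloaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_answer("What are the common symptoms of anemia?"))
```

The FP8 quantization and 4k context listed above apply to the hosted deployment; loading the original weights locally requires roughly 14 GB of memory in FP16, so a GPU (or a quantized variant) is advisable.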
