AdaptLLM/medicine-LLM-13B
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4K · Published: Dec 19, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

AdaptLLM/medicine-LLM-13B is a 13-billion-parameter language model developed by AdaptLLM, built on the LLaMA-1 architecture. It is continually pre-trained on domain-specific biomedical corpora that are converted into reading-comprehension-style texts, a method designed to inject domain knowledge while preserving the base model's question-answering ability. The model is optimized for tasks in the biomedicine domain and reports performance comparable to much larger domain-specific models.
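A minimal usage sketch with the Hugging Face transformers library is shown below. It assumes the checkpoint is published under the model ID `AdaptLLM/medicine-LLM-13B`; since the base model is not instruction-tuned, the `build_prompt` helper (a convention chosen here for illustration, not part of the model card) formats a bare question with an answer cue rather than a chat template.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face model ID; adjust if the checkpoint is hosted elsewhere.
MODEL_ID = "AdaptLLM/medicine-LLM-13B"


def build_prompt(question: str) -> str:
    """Format a biomedical question as a zero-shot completion prompt.

    The model is a continually pre-trained base model, not a chat model,
    so a plain question/answer cue is used instead of a chat template.
    """
    return f"Question: {question}\nAnswer:"


def generate_answer(question: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate an answer (downloads ~13B weights)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Note that the 4K context length caps the combined prompt and generated tokens, so long clinical passages may need truncation before being embedded in the prompt.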
