pfnet/Preferred-MedLLM-Qwen-72B
TEXT GENERATION
- Concurrency Cost: 4
- Model Size: 72.7B
- Quant: FP8
- Ctx Length: 32k
- Published: Mar 5, 2025
- License: qwen
- Architecture: Transformer

Preferred-MedLLM-Qwen-72B is a 72.7-billion-parameter model developed by Preferred Networks, Inc., fine-tuned from Qwen/Qwen2.5-72B. It has undergone continued pretraining on an original corpus of medical-related text, specializing it in medical knowledge. The model supports a context length of 131,072 tokens and achieves superior performance on the Japanese medical licensing examination benchmark (IgakuQA), outperforming models such as GPT-4o and Qwen2.5-72B.
