Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Dec 29, 2025 · License: MIT · Architecture: Transformer · Open weights

Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1 is a 1.5-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture. It is specialized for medical applications through a combination of continued pre-training (CPT) on medical-domain data and supervised fine-tuning (SFT). The model targets medical question answering and related natural language processing tasks in the healthcare sector, and supports a context length of 32k tokens.
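A minimal usage sketch with Hugging Face `transformers` is shown below. The chat-template call and system prompt follow standard Qwen2.5-Instruct conventions and are assumptions; verify the exact prompt format against the model repository before relying on it.

```python
# Hypothetical usage sketch for Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1.
# Assumes the standard Qwen2.5-Instruct chat format; the system prompt below is
# illustrative, not prescribed by the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1"


def build_messages(question: str) -> list[dict]:
    """Assemble a chat in the messages format expected by apply_chat_template."""
    return [
        {"role": "system", "content": "You are a helpful medical assistant."},
        {"role": "user", "content": question},
    ]


def generate(question: str, max_new_tokens: int = 256) -> str:
    """Load the model, format the prompt, and return the decoded completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("What are common symptoms of iron-deficiency anemia?"))
```

As with any 1.5B-scale medical model, outputs should be reviewed by a qualified professional before clinical use.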
