Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1

Hugging Face
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Dec 29, 2025 · License: MIT · Architecture: Transformer · Open weights

Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1 is a 1.5-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture. It is adapted for medical applications through a combination of continued pre-training (CPT) on medical-domain data and supervised fine-tuning (SFT). The model targets medical question answering and related natural language processing tasks in the healthcare sector, and supports a context length of 32,768 tokens.


Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1 Overview

This model is a specialized 1.5-billion-parameter instruction-tuned language model built on the Qwen2.5 architecture. Its distinguishing feature is targeted training for the medical domain: continued pre-training (CPT) on medical text followed by supervised fine-tuning (SFT). This two-phase approach is intended to improve its understanding and generation of medical-specific language.

Key Capabilities

  • Medical Domain Specialization: Optimized for processing and generating text relevant to healthcare, medical research, and clinical contexts.
  • Instruction Following: Designed to accurately follow instructions for various natural language processing tasks.
  • Large Context Window: Supports a context length of 32,768 tokens, enabling it to handle long medical documents and complex queries.
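Qwen2.5-Instruct models use a ChatML-style chat template for instruction following. A minimal sketch of that prompt layout is shown below; in practice you would call the tokenizer's `apply_chat_template` rather than assembling the string by hand, and the system message here is an illustrative assumption:

```python
def build_qwen_chat_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt as used by the Qwen2.5 chat template.

    For real use, prefer `tokenizer.apply_chat_template(...)` from the
    `transformers` library; this hand-rolled version only illustrates
    the <|im_start|>/<|im_end|> structure of the format.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


prompt = build_qwen_chat_prompt(
    "You are a helpful medical assistant.",
    "What are common symptoms of iron-deficiency anemia?",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open for the model to generate its reply.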

Good For

  • Medical Question Answering: Answering queries related to diseases, treatments, symptoms, and medical procedures.
  • Clinical Text Analysis: Processing and understanding electronic health records, medical reports, and research papers.
  • Healthcare Applications: Developing AI-powered tools for medical information retrieval, patient education, and clinical decision support where domain-specific knowledge is crucial.
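For the applications above, the model can be loaded with the Hugging Face `transformers` library. A minimal sketch, assuming `transformers` and `torch` are installed; the system prompt and generation settings are illustrative choices, not part of the model card:

```python
MODEL_ID = "Tianye88/Qwen2.5-1.5B-Instruct-Medical-cpt-sft-v1"


def build_messages(question: str) -> list:
    """Chat messages for the Qwen2.5 chat template (system prompt is an assumption)."""
    return [
        {"role": "system", "content": "You are a helpful medical assistant."},
        {"role": "user", "content": question},
    ]


def ask(question: str, max_new_tokens: int = 256) -> str:
    """Generate an answer with the model (downloads weights on first call)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer  # lazy import

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


# Example (downloads the model weights on first run):
# print(ask("What are common symptoms of iron-deficiency anemia?"))
```

As with any medical-domain model, generated answers should be reviewed by qualified professionals before use in clinical settings.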