YangWu001/intervention_chinese
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Apr 20, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

YangWu001/intervention_chinese is a 1.5-billion-parameter Qwen2-based causal language model developed by Yang Wu. Fine-tuned for biomedical research, it specializes in intervention studies, clinical trials, and medical research assistance. The model acts as a proactive research assistant, offering bilingual support in Chinese and English for tasks such as drafting research content, analyzing data, and advising on research methodology. It supports a 32,768-token context length and is optimized for academic and clinical research workflows.