SylvanL/ChatTCM-7B-SFT
Text generation · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Oct 22, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

SylvanL/ChatTCM-7B-SFT is a 7.6-billion-parameter instruction-tuned language model developed by SylvanL and fine-tuned specifically for Traditional Chinese Medicine (TCM). It handles translation of classical Chinese medical texts, clinical diagnostic reasoning, comprehensive TCM knowledge Q&A, and NLP tasks involving TCM terminology. According to its authors, it is the first fully open-source TCM large language model in China, with the datasets, training methods, and model weights all released.
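As a rough illustration, the model could be queried through the Hugging Face `transformers` text-generation pipeline. This is a minimal sketch, not verified against the released weights: the chat-message format, the system prompt, and the `build_messages` helper are assumptions, and loading a 7.6B FP8 model requires a suitable GPU.

```python
def build_messages(question: str) -> list:
    """Wrap a TCM question in the chat-message format that most
    instruction-tuned models expect (an assumption for this model)."""
    return [
        {"role": "system",
         "content": "You are a Traditional Chinese Medicine assistant."},
        {"role": "user", "content": question},
    ]


if __name__ == "__main__":
    # Requires `pip install transformers torch`; sketch only.
    from transformers import pipeline

    pipe = pipeline("text-generation", model="SylvanL/ChatTCM-7B-SFT")
    out = pipe(build_messages("What are the functions of ginseng in TCM?"),
               max_new_tokens=256)
    print(out[0]["generated_text"])
```

The generation call is guarded under `__main__` so the prompt-building helper can be reused without triggering a multi-gigabyte model download.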
