rwibawa/DeepSeek-R1-Medical-COT
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 10, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights
DeepSeek-R1-Medical-COT is an 8-billion-parameter causal language model developed by rwibawa, fine-tuned from unsloth/deepseek-r1-distill-llama-8b-unsloth-bnb-4bit. It supports a 32,768-token context length and was trained with Unsloth and Hugging Face's TRL library for accelerated fine-tuning. The model targets medical chain-of-thought (CoT) reasoning, building on the distilled DeepSeek-R1 base to specialize in healthcare-domain applications.
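A minimal loading sketch, assuming the weights are published on the Hugging Face Hub under the repo id rwibawa/DeepSeek-R1-Medical-COT and load through the standard transformers API; the prompt text and generation settings below are illustrative, since the card does not document the exact prompt template used during fine-tuning.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the model is hosted on the Hub under this repo id.
model_id = "rwibawa/DeepSeek-R1-Medical-COT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the hosted endpoint serves FP8; bf16 is a common local default
    device_map="auto",
)

# Illustrative medical chain-of-thought prompt (not the author's documented template).
prompt = (
    "A 45-year-old patient presents with chest pain and shortness of breath. "
    "Reason step by step about the differential diagnosis."
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.6)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The 32k context window leaves ample room for long clinical vignettes plus the model's step-by-step reasoning trace; trim max_new_tokens or the prompt as needed for your hardware.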