Srini18/DeepSeek-R1-Medical-COT
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 18, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
Srini18/DeepSeek-R1-Medical-COT is an 8 billion parameter Llama-based model developed by Srini18, fine-tuned from unsloth/deepseek-r1-distill-llama-8b-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which the author reports yielded 2x faster training. The "Medical-COT" designation suggests the model is optimized for medical reasoning tasks, likely leveraging Chain-of-Thought (step-by-step reasoning) capabilities inherited from the DeepSeek-R1 distillation.
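As a minimal sketch of how such a Chain-of-Thought fine-tune is typically prompted: the helper below wraps a clinical question in an instruction that elicits step-by-step reasoning before the final answer. The instruction wording, the `### Question:`/`### Response:` section markers, and the opening `<think>` tag (a DeepSeek-R1 convention) are assumptions; the exact template this fine-tune was trained on is not documented here. For actual inference you would pass the resulting string to the model via the standard `transformers` text-generation pipeline with the model ID `Srini18/DeepSeek-R1-Medical-COT`.

```python
# Hypothetical prompt builder for a medical Chain-of-Thought model.
# The template below is an ASSUMPTION modeled on common Alpaca-style
# fine-tuning formats and the DeepSeek-R1 <think> convention; check the
# model card's training code for the template actually used.

def build_medical_cot_prompt(question: str) -> str:
    """Wrap a clinical question in an instruction that asks the model
    to reason step by step before answering."""
    instruction = (
        "Below is a medical question. Think through the problem "
        "step by step before giving a final answer."
    )
    # Ending with an opening <think> tag nudges R1-style models to
    # emit their reasoning trace first, then the answer.
    return (
        f"{instruction}\n\n"
        f"### Question:\n{question}\n\n"
        f"### Response:\n<think>"
    )

prompt = build_medical_cot_prompt(
    "A 45-year-old presents with chest pain radiating to the left arm. "
    "What is the most likely diagnosis?"
)
print(prompt)
```

With the prompt built, inference would follow the usual pattern (`pipeline("text-generation", model="Srini18/DeepSeek-R1-Medical-COT")`), decoding until the model closes its reasoning block and produces the answer.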