ShubhamZoro/DeepSeek-R1-Medical-COT-FP16-CLEAN
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Aug 12, 2025 · Architecture: Transformer

ShubhamZoro/DeepSeek-R1-Medical-COT-FP16-CLEAN is an 8-billion-parameter language model with a 32,768-token context length. It is a cleaned version of a DeepSeek-R1 variant, likely fine-tuned for medical applications using Chain-of-Thought (CoT) reasoning. Its primary strength is processing and generating medical-domain content, making it suitable for specialized healthcare AI tasks.
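As a rough sketch of how such a model might be used, the snippet below loads it with the Hugging Face `transformers` library and wraps a medical question in a simple chain-of-thought style prompt. The repo id is taken from this card, but the prompt template and helper names (`build_prompt`, `generate`) are illustrative assumptions, not the model's documented interface; an 8B model also requires substantial GPU memory to run.

```python
MODEL_ID = "ShubhamZoro/DeepSeek-R1-Medical-COT-FP16-CLEAN"


def build_prompt(question: str) -> str:
    """Wrap a medical question in an assumed chain-of-thought template."""
    return (
        "Below is a medical question. Reason step by step, "
        "then give a final answer.\n\n"
        f"### Question:\n{question}\n\n"
        "### Response:\n"
    )


def generate(question: str, max_new_tokens: int = 512) -> str:
    """Generate an answer; imports transformers lazily so the prompt
    helper stays importable without the library installed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("What are common first-line treatments for hypertension?"))
```

The explicit "reason step by step" instruction in the template is one conventional way to elicit CoT-style output; a fine-tuned model may instead expect a specific chat template, which the tokenizer's `apply_chat_template` method would provide if one is defined.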
