zou-lab/BioMed-R1-8B
Text Generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Jun 25, 2025 · License: llama3.1 · Architecture: Transformer

zou-lab/BioMed-R1-8B is an 8-billion-parameter medical large language model with a 32,768-token context window, developed by Zou Lab. It is fine-tuned with supervised fine-tuning and reinforcement learning on reasoning-heavy and adversarial medical examples to strengthen self-correction and backtracking. The model targets improved medical reasoning and robustness to misleading information, and achieves strong overall and adversarial performance among similarly sized biomedical LLMs.