Ujjwal-Tyagi/Baichuan-M2-32B
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Mar 30, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

Baichuan-M2-32B is a 32.8 billion parameter medical-enhanced reasoning model developed by Baichuan AI, built upon Qwen2.5-32B. It features an innovative Large Verifier System and domain-specific fine-tuning on real-world medical questions. This model excels in medical reasoning tasks, achieving leading performance among open-source models on HealthBench, while maintaining strong general capabilities and supporting efficient 4-bit quantization for deployment.
