fgewfskjfsd/II-Medical-7B-Preview
Text generation · Model size: 7.6B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Apr 3, 2026

II-Medical-7B-Preview is a 7.6-billion-parameter medical reasoning model developed by fgewfskjfsd, fine-tuned from Qwen/Qwen2.5-7B-Instruct. It is designed for the medical domain, with a focus on medical question answering and complex clinical reasoning. The model was trained on a comprehensive medical knowledge dataset and further optimized with DAPO on hard medical reasoning data, reaching an average score of 66.4 across ten medical QA benchmarks.
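Since the model is fine-tuned from Qwen2.5-7B-Instruct, it should be usable through the standard Hugging Face `transformers` chat-template workflow. The sketch below is an assumption, not official usage: the generation parameters are illustrative, and the `build_messages` helper is a hypothetical convenience wrapper.

```python
"""Hypothetical inference sketch for II-Medical-7B-Preview.

Assumes the model is served as a standard Hugging Face causal LM with a
chat template, as its Qwen2.5-7B-Instruct base is.
"""

MODEL_ID = "fgewfskjfsd/II-Medical-7B-Preview"


def build_messages(question: str) -> list[dict]:
    """Wrap a single medical question in chat-format messages."""
    return [{"role": "user", "content": question}]


def generate(question: str, max_new_tokens: int = 512) -> str:
    """Run one question through the model and return the decoded answer.

    Imports are kept inside the function so the lightweight helper above
    can be used without torch/transformers installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Apply the chat template inherited from the Qwen2.5 base model.
    inputs = tokenizer.apply_chat_template(
        build_messages(question),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

For example, `generate("What are the first-line treatments for uncomplicated hypertension?")` would return the model's reasoning and answer as a string; with a 32k context, multi-turn case discussions also fit comfortably.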
