Nina2811aw/qwen-32B-medical
TEXT GENERATION
Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Concurrency Cost: 2
Published: Mar 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Nina2811aw/qwen-32B-medical is a 32.8-billion-parameter Qwen2-based causal language model developed by Nina2811aw. It was fine-tuned with Unsloth and Hugging Face's TRL library, a toolchain geared toward efficient training. Its primary differentiator is its medical-domain focus, making it suitable for specialized applications requiring medical knowledge.
