zycalice/Qwen2.5-32B-Instruct_medical_mlp-down_resp
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Published: Feb 17, 2026 · Architecture: Transformer · Status: Cold

The zycalice/Qwen2.5-32B-Instruct_medical_mlp-down_resp model is a Qwen2.5-based instruction-tuned language model published by zycalice. It has 32.8B parameters, is served with FP8 quantization, and supports a 32k-token context length. Its name suggests a fine-tune of Qwen2.5-32B-Instruct for medical applications, likely targeting the MLP down-projection weights and trained on medical response data. The model is intended for specialized use cases that require medical domain understanding and response generation.
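As a hedged sketch of how such a checkpoint might be used, the following assumes the model is available on the Hugging Face Hub under the identifier above and follows the standard Qwen2.5-Instruct chat format; the system prompt and helper names are illustrative, not part of the model card.

```python
MODEL_ID = "zycalice/Qwen2.5-32B-Instruct_medical_mlp-down_resp"  # assumed Hub identifier


def build_messages(question: str) -> list[dict]:
    """Assemble a Qwen2.5-Instruct-style chat: a system turn plus the user question.

    The system prompt is a placeholder assumption, not documented by the model card.
    """
    return [
        {"role": "system", "content": "You are a careful medical assistant."},
        {"role": "user", "content": question},
    ]


def generate(question: str, max_new_tokens: int = 256) -> str:
    """Load the model with transformers and generate a response (sketch only).

    Imports are deferred so the file can be inspected without transformers
    installed; loading a 32.8B checkpoint requires substantial GPU memory.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Render the chat into the model's prompt template, then decode only
    # the newly generated tokens.
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Because the weights are served cold and quantized to FP8 here, hosted inference endpoints may be a more practical route than local loading for a model of this size.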
