thrnn/qwen2.5-1.5b-medical-sft-dare
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 2, 2026 · Architecture: Transformer

thrnn/qwen2.5-1.5b-medical-sft-dare is a 1.5-billion-parameter language model based on the Qwen2.5-1.5B-Instruct architecture, fine-tuned for medical applications. It was created with the Linear DARE merge method, which combines the base Qwen model with a specialized medical SFT LoRA. The model is designed for tasks requiring medical domain knowledge, and its 32,768-token context length allows it to process long inputs in a single pass.
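To give a sense of what the DARE step does during a merge: DARE (Drop And REscale) randomly drops a fraction of the delta between fine-tuned and base weights, then rescales the surviving entries so the expected delta is unchanged. The sketch below illustrates this idea on toy NumPy arrays; it is not the actual recipe or hyperparameters used to produce this model, and the function name and drop rate are illustrative assumptions.

```python
import numpy as np

def dare_merge(base, finetuned, drop_rate=0.5, seed=0):
    """Toy illustration of DARE: Drop And REscale the weight delta."""
    # Delta between the fine-tuned and base weights.
    delta = finetuned - base
    rng = np.random.default_rng(seed)
    # Randomly drop delta entries with probability drop_rate, then
    # rescale the survivors by 1 / (1 - drop_rate) so the merged
    # delta has the same expected value as the original delta.
    keep_mask = rng.random(delta.shape) >= drop_rate
    rescaled = np.where(keep_mask, delta / (1.0 - drop_rate), 0.0)
    return base + rescaled

base = np.zeros((4, 4))
finetuned = np.ones((4, 4))          # delta is 1 everywhere
merged = dare_merge(base, finetuned, drop_rate=0.5)
# Each merged entry is either 0 (dropped) or 2 (kept and rescaled by 1/0.5).
```

In a real Linear DARE merge, this drop-and-rescale is applied per tensor to the SFT delta before it is linearly combined with the base model's weights.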
