Mahmoud7Dev/qwen2.5-1.5b-medical-merged
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Feb 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Status: Warm

The Mahmoud7Dev/qwen2.5-1.5b-medical-merged model is an Arabic medical chat model built on Qwen/Qwen2.5-1.5B-Instruct. This 1.5-billion-parameter model was fine-tuned with LoRA on medical symptom and disease data, specializing it for initial medical-assistance queries in Arabic. It is designed to provide preliminary medical guidance without offering definitive diagnoses or treatments, and its 32,768-token context length supports extended interactions.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
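These parameters map onto a standard sampler configuration. A minimal sketch of packaging them into a request body for an OpenAI-compatible chat-completions endpoint follows; the numeric values are illustrative placeholders, not published defaults for this model or for Featherless.

```python
# Sketch: assembling the sampler settings listed above into a request payload
# for an OpenAI-compatible chat-completions endpoint.
# All numeric values below are placeholder assumptions, NOT published defaults.
payload = {
    "model": "Mahmoud7Dev/qwen2.5-1.5b-medical-merged",
    "messages": [
        # An Arabic medical query would go here.
        {"role": "user", "content": "..."},
    ],
    # Sampler settings (placeholder values):
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}
```

The payload dict can then be serialized to JSON and sent with any HTTP client; which of these keys an endpoint honors varies by provider, so unsupported ones may simply be ignored.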