Aikyam-Lab/CURE-MED-1.5B
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32K · Published: Jan 21, 2026 · License: apache-2.0 · Architecture: Transformer · 0.0K · Open Weights · Warm
CURE-MED-1.5B by Aikyam Lab is a 1.5-billion-parameter language model, fine-tuned from Qwen1.5-1.5B-instruct and designed specifically for multilingual medical reasoning. It uses a curriculum-informed reinforcement learning framework to improve logical correctness and language stability in healthcare applications across 13 languages, including underrepresented ones. Its primary use case is providing accurate responses to open-ended medical queries in diverse linguistic contexts.
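As a sketch of how such a hosted model is typically queried, here is a hypothetical request payload for an OpenAI-compatible chat endpoint. The repo id comes from the page header; the sampler values and the system/user messages are illustrative placeholders, not recommended or documented settings for this model.

```python
import json

# Hypothetical payload for an OpenAI-compatible chat-completions endpoint.
# The model id matches the page header; temperature/top_p/max_tokens are
# placeholder values, not settings published for CURE-MED-1.5B.
payload = {
    "model": "Aikyam-Lab/CURE-MED-1.5B",
    "messages": [
        {"role": "system",
         "content": "You are a careful multilingual medical assistant."},
        # The model card advertises 13 languages; a Spanish query as example:
        {"role": "user",
         "content": "¿Cuáles son los síntomas tempranos de la anemia?"},
    ],
    "temperature": 0.7,
    "top_p": 0.9,
    "max_tokens": 512,
}

# Serialize the request body as it would be POSTed to the endpoint.
body = json.dumps(payload, ensure_ascii=False)
print(body)
```

The payload is only constructed here, not sent; any OpenAI-compatible client or a plain HTTP POST could carry it to the serving endpoint.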
Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Parameters tracked: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p (no values recorded).
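The sampler parameters listed above shape how the next token is drawn from the model's output distribution. As a minimal, self-contained sketch (not Featherless's implementation, and the probability values are purely illustrative), here is how top-k, top-p (nucleus), and min-p filtering act on a toy distribution:

```python
from typing import List

def top_k_filter(probs: List[float], k: int) -> List[float]:
    """Zero out all but the k most probable tokens, then renormalize."""
    if k <= 0 or k >= len(probs):
        return probs[:]
    threshold = sorted(probs, reverse=True)[k - 1]
    kept = [p if p >= threshold else 0.0 for p in probs]
    total = sum(kept)
    return [p / total for p in kept]

def top_p_filter(probs: List[float], top_p: float) -> List[float]:
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p (nucleus sampling), then renormalize."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = set(), 0.0
    for i in order:
        kept.add(i)
        cum += probs[i]
        if cum >= top_p:
            break
    total = sum(probs[i] for i in kept)
    return [probs[i] / total if i in kept else 0.0 for i in range(len(probs))]

def min_p_filter(probs: List[float], min_p: float) -> List[float]:
    """Drop tokens whose probability falls below min_p times the top
    token's probability, then renormalize."""
    cutoff = min_p * max(probs)
    kept = [p if p >= cutoff else 0.0 for p in probs]
    total = sum(kept)
    return [p / total for p in kept]

# Toy next-token distribution over five candidate tokens.
dist = [0.5, 0.25, 0.15, 0.07, 0.03]
print(top_k_filter(dist, 2))    # only the two largest survive
print(top_p_filter(dist, 0.9))  # nucleus covers 0.5 + 0.25 + 0.15 = 0.90
print(min_p_filter(dist, 0.2))  # cutoff 0.2 * 0.5 = 0.1 drops the last two
```

Temperature and the repetition/frequency/presence penalties act earlier in the pipeline, rescaling logits before these filters select which tokens remain eligible for sampling.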