nicolay-r/qwen25-05b-multiclinsum-distil
Task: Text generation
Model size: 0.5B parameters
Quantization: BF16
Context length: 32k
License: MIT
Architecture: Transformer (decoder-only)
Concurrency cost: 1
nicolay-r/qwen25-05b-multiclinsum-distil is a 0.5-billion-parameter decoder-only language model built on Qwen/Qwen2.5-0.5B-Instruct. It is fine-tuned on the MultiClinSum dataset for multilingual clinical text summarization in English, French, Portuguese, and Spanish. During training it was distilled using rationales inferred by the larger Qwen2.5-72B-Instruct model, optimizing it for generating concise summaries of clinical texts. Its primary application is clinical natural language processing tasks that require efficient, accurate summarization.
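A minimal usage sketch, assuming the standard Hugging Face `transformers` text-generation pipeline. The prompt template below is an illustrative assumption, not the documented training format, and the heavy model download is gated behind a flag:

```python
# Hypothetical usage sketch for the model; the prompt wording and
# generation parameters are assumptions, not documented settings.

MODEL_ID = "nicolay-r/qwen25-05b-multiclinsum-distil"

def build_prompt(clinical_note: str) -> str:
    """Wrap a clinical note in a simple summarization instruction."""
    return (
        "Summarize the following clinical case report:\n\n"
        f"{clinical_note}\n\nSummary:"
    )

RUN_MODEL = False  # set to True to download the 0.5B checkpoint and generate

if RUN_MODEL:
    from transformers import pipeline
    summarizer = pipeline("text-generation", model=MODEL_ID)
    note = "A 62-year-old male presented with acute chest pain..."
    result = summarizer(build_prompt(note), max_new_tokens=128, do_sample=False)
    print(result[0]["generated_text"])
```

For the multilingual inputs the model supports, the same prompt scaffold can be reused with a note in French, Portuguese, or Spanish, since the underlying Qwen2.5 tokenizer covers all four languages.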