guangyangnlp/Qwen3-4B-SFT-medical-1e-5
TEXT GENERATION | Concurrency Cost: 1 | Model Size: 4B | Quant: BF16 | Ctx Length: 32K | Published: Feb 22, 2026 | License: other | Architecture: Transformer | Status: Warm

guangyangnlp/Qwen3-4B-SFT-medical-1e-5 is a 4-billion-parameter language model fine-tuned from Qwen/Qwen3-4B. It is specialized for medical applications, having been fine-tuned on the medical_o1_train dataset, and retains the base Qwen3 architecture's 32K context length.
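As a minimal usage sketch, the snippet below builds an OpenAI-compatible chat-completion request for this model. The endpoint URL, the `FEATHERLESS_API_KEY` placeholder, and the example question are assumptions for illustration, not part of the model card.

```python
import json
import urllib.request

# Assumption: Featherless exposes an OpenAI-compatible chat-completions endpoint.
API_URL = "https://api.featherless.ai/v1/chat/completions"
MODEL_ID = "guangyangnlp/Qwen3-4B-SFT-medical-1e-5"

def build_request(question: str, max_tokens: int = 512) -> dict:
    """Build an OpenAI-style chat payload for a single medical question."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": question}],
        "max_tokens": max_tokens,
    }

payload = build_request("What are common causes of iron-deficiency anemia?")
body = json.dumps(payload).encode("utf-8")

# To actually send the request (needs a real API key; not executed here):
# req = urllib.request.Request(
#     API_URL,
#     data=body,
#     headers={"Content-Type": "application/json",
#              "Authorization": "Bearer FEATHERLESS_API_KEY"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The payload shape follows the common OpenAI chat-completions convention, so the same dictionary also works with most OpenAI-compatible client libraries.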


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
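The settings above can be collected into a single generation config. The values below are illustrative placeholders (the actual user-favored values are not shown on this page), together with a small sanity check before sending them to an API.

```python
# Illustrative sampler configuration; values are placeholders, not the
# actual Featherless user statistics.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

def validate(cfg: dict) -> dict:
    """Basic range checks on sampler values before use."""
    assert 0.0 <= cfg["temperature"] <= 2.0, "temperature out of range"
    assert 0.0 < cfg["top_p"] <= 1.0, "top_p must be in (0, 1]"
    assert cfg["top_k"] >= 0, "top_k must be non-negative"
    assert 0.0 <= cfg["min_p"] <= 1.0, "min_p must be in [0, 1]"
    assert cfg["repetition_penalty"] > 0.0, "repetition_penalty must be positive"
    return cfg

checked = validate(sampler_config)
```

Keeping the sampler settings in one dictionary makes it easy to splat them into a request payload (e.g. `{**base_request, **sampler_config}`) or to swap between saved presets.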