oof-baroomf/csrsef-thinking-20260323T195339Z-it01-pubmedqa
Text generation · Concurrency cost: 1 · Model size: 4B · Quant: BF16 · Context length: 32k · Published: Mar 24, 2026 · Architecture: Transformer

The oof-baroomf/csrsef-thinking-20260323T195339Z-it01-pubmedqa model is a 4 billion parameter language model with a 32768-token context length, created by oof-baroomf. It merges Qwen3-4B-Instruct-2507 with a specialized PubMedQA instruction-tuned model using the NuSLERP merge method. The model is designed for reasoning and knowledge tasks in the biomedical domain, with a particular focus on question answering over medical literature.
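To illustrate the idea behind SLERP-style merges, the sketch below implements plain spherical linear interpolation between two flattened weight tensors. This is a simplified illustration only: mergekit's actual NuSLERP method applies additional normalization and per-tensor handling not shown here, and the function name and parameters are illustrative, not mergekit's API.

```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors.

    Interpolates along the arc between a and b, weighting b by t.
    """
    # Normalize to find the angle between the two directions
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation
        return (1.0 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b

# Blend two toy "weight" vectors halfway along the arc between them
merged = slerp(np.array([1.0, 0.0]), np.array([0.0, 1.0]), 0.5)
```

For orthogonal unit vectors and t = 0.5, the result lies on the unit circle midway between them, unlike a plain average, which would shrink the norm; preserving tensor magnitude is the usual motivation for SLERP-based merging over linear averaging.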
