oof-baroomf/csrsef-thinking-20260325T021216Z-it01-pubmedqa
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Mar 25, 2026 · Architecture: Transformer
oof-baroomf/csrsef-thinking-20260325T021216Z-it01-pubmedqa is a 4-billion-parameter language model produced with the NuSLERP merge method, using Qwen/Qwen3-4B-Instruct-2507 as the base. It integrates Qwen/Qwen3-4B-Thinking-2507 and a PubMedQA instruction-tuned model, and is intended for reasoning and knowledge-extraction tasks, particularly in the biomedical domain, with a 32,768-token context length.
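Merges like this are commonly built with mergekit, which supports a `nuslerp` merge method. The YAML below is a hypothetical sketch of what such a recipe could look like; the actual weights, parameters, and the name of the PubMedQA-tuned component are not published here, so every value and the placeholder model name are assumptions.

```yaml
# Hypothetical mergekit recipe sketch -- NOT the published merge config.
# NuSLERP spherically interpolates model weights relative to a base model.
merge_method: nuslerp
base_model: Qwen/Qwen3-4B-Instruct-2507
models:
  - model: Qwen/Qwen3-4B-Thinking-2507
    parameters:
      weight: 0.5                      # assumed weighting, not from the card
  - model: some-org/qwen3-4b-pubmedqa  # placeholder name for the PubMedQA-tuned model
    parameters:
      weight: 0.5                      # assumed weighting, not from the card
dtype: bfloat16                        # matches the BF16 quant listed above
```

Saved as `config.yml`, a recipe in this shape would typically be run with mergekit's `mergekit-yaml config.yml ./output-model` command to produce the merged checkpoint.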