suhailult777/MedBrain-0.5B
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Mar 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm
MedBrain-0.5B by suhailult777 is a 0.5-billion-parameter, custom-trained medical language model with a 32,768-token context length. Originally trained in JAX/Flax and later optimized for PyTorch, it is designed to provide accurate, structured, and context-aware responses to healthcare inquiries. Through fine-tuning on a medical instruction corpus, the model targets medical triage assistance, clinical handoff generation, and patient education.
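As a usage sketch, the model could be loaded with the Hugging Face `transformers` library, assuming the weights are published under the `suhailult777/MedBrain-0.5B` repo id and follow a standard causal-LM layout. The instruction-style prompt format shown here is an assumption for illustration; the model's actual prompt or chat template may differ.

```python
# Hypothetical sketch: loading MedBrain-0.5B via Hugging Face transformers.
# The repo id comes from this model card; the prompt format is assumed.

MODEL_ID = "suhailult777/MedBrain-0.5B"


def build_prompt(question: str) -> str:
    """Wrap a user question in a simple instruction-style prompt.

    This format is illustrative only; consult the model card for the
    template the model was actually fine-tuned with.
    """
    return f"### Instruction:\n{question}\n\n### Response:\n"


def generate_answer(question: str, max_new_tokens: int = 256) -> str:
    # Heavy dependencies are imported lazily so the prompt helper above
    # can be used without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # the card lists BF16 weights
    )
    inputs = tokenizer(build_prompt(question), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


# Downloads the weights on first call, e.g.:
# print(generate_answer("What are common symptoms of anemia?"))
```

Note that a 0.5B model at BF16 needs roughly 1 GB of memory for the weights alone, so it is small enough to run on CPU for experimentation.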