thaddickson/Delphi-7B-v1
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Context Length: 32k · Published: Mar 16, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Delphi-7B-v1 is a 7.6 billion parameter reasoning model developed by Thaddeus Dickson, CEO of Xpio Health. It is a merge of six Qwen 2.5 7B specialist models, refined through a multi-stage training pipeline that includes LoRA fine-tuning, SLERP weight blending, and voice SFT. The model is built for healthcare cybersecurity, clinical operations, and cross-domain problem solving, and aims to give direct, specific, non-hedging responses in a domain-expert voice.
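The SLERP blending step mentioned above refers to spherical linear interpolation, a common way to merge two model checkpoints along the arc between their weight vectors rather than a straight line. As a rough illustration (not the actual merge pipeline used for this model, which operates per-tensor across full checkpoints), the core formula looks like this:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Illustrative sketch only: real merge tooling applies this
    tensor-by-tensor across entire model checkpoints.
    """
    v0_flat = v0.ravel()
    v1_flat = v1.ravel()
    # Cosine of the angle between the flattened tensors
    dot = np.dot(v0_flat, v1_flat) / (
        np.linalg.norm(v0_flat) * np.linalg.norm(v1_flat)
    )
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    sin_omega = np.sin(omega)
    # Standard SLERP: weights follow the great-circle arc between v0 and v1
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + (
        np.sin(t * omega) / sin_omega
    ) * v1

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # halfway along the arc between a and b
```

Compared with naive weight averaging, SLERP preserves the norm-geometry of the interpolated weights, which is why it is popular for merging specialist checkpoints of the same base architecture.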
