williamtom-3010/qwen-health-undrwtr-sft-v1
Text generation
- Concurrency cost: 1
- Model size: 7.6B
- Quantization: FP8
- Context length: 32k
- Published: Mar 18, 2026
- Architecture: Transformer

williamtom-3010/qwen-health-undrwtr-sft-v1 is a 7.6-billion-parameter language model based on the Qwen architecture. As its name suggests, it is fine-tuned for specialized applications, likely in the health domain, and supports a context length of 32768 tokens. Its primary differentiator is this specialized fine-tuning, which makes it suited to tasks that require domain-specific understanding or generation.
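As a sketch of how such a model might be used, the snippet below loads the repository with the Hugging Face `transformers` library and budgets the prompt against the 32768-token context window. The helper names (`fits_in_context`, `generate`) and the generation parameters are illustrative assumptions, not part of the model card; only the repo id and context length come from the listing above.

```python
MODEL_ID = "williamtom-3010/qwen-health-undrwtr-sft-v1"  # repo id from the card
MAX_CTX = 32768  # 32k context length from the card


def fits_in_context(prompt_tokens: int, max_new_tokens: int, ctx: int = MAX_CTX) -> bool:
    """Check that the prompt plus the generation budget stays within the context window."""
    return prompt_tokens + max_new_tokens <= ctx


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Hypothetical generation helper; downloads the model weights on first call."""
    # Import kept inside the function so the budgeting helper above can be
    # used without transformers installed or any weights downloaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    if not fits_in_context(inputs["input_ids"].shape[1], max_new_tokens):
        raise ValueError("prompt too long for the 32k context window")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Keeping an explicit context-budget check up front avoids silent truncation when long health-domain documents are passed as input.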
