winninghealth/WiNGPT2-Llama-3-8B-Base
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 8K
Published: Apr 23, 2024
License: apache-2.0
Architecture: Transformer

WiNGPT2-Llama-3-8B-Base is an 8-billion-parameter large language model developed by winninghealth on the Llama 3 architecture. The model is specialized for the medical domain, incorporating professional medical knowledge and data to support intelligent healthcare services. It is suited to medical question answering, diagnostic support, and general medical knowledge tasks, and supports a context length of 8192 tokens.
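As a Llama 3 derivative, the checkpoint can typically be loaded with the Hugging Face transformers library. The snippet below is a minimal sketch, assuming the model is published under the ID above with the standard Llama 3 layout; it is illustrative and not taken from the official model card.

# Minimal sketch: load WiNGPT2-Llama-3-8B-Base with Hugging Face transformers.
# Assumes the checkpoint follows the standard Llama 3 layout; verify against
# the official model card before relying on this.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "winninghealth/WiNGPT2-Llama-3-8B-Base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # an 8B model in bf16 fits on a single 24 GB+ GPU
    device_map="auto",
)

prompt = "What are the common symptoms of type 2 diabetes?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))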


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Each configuration specifies the sampler parameters listed below; an example request using these parameters follows the list.

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
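These sampler parameters map onto an OpenAI-compatible completions request. The sketch below assumes Featherless's OpenAI-compatible endpoint at https://api.featherless.ai/v1 and the openai Python client; passing the non-standard parameters (top_k, repetition_penalty, min_p) via extra_body is an assumption about the server, and the values shown are placeholders, not one of the actual top-3 configs.

# Minimal sketch: request a completion with explicit sampler settings through
# an OpenAI-compatible API. Endpoint URL and extra_body pass-through for
# non-standard parameters are assumptions; values below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.completions.create(
    model="winninghealth/WiNGPT2-Llama-3-8B-Base",
    prompt="List three first-line treatments for hypertension.",
    max_tokens=256,
    temperature=0.7,            # standard OpenAI sampler parameters
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={                # server-specific extensions, passed through verbatim
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].text)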