HPAI-BSC/Qwen2.5-Aloe-Beta-72B
Text generation · Concurrency cost: 4 · Model size: 72.7B · Quantization: FP8 · Context length: 32k · Published: Dec 23, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer

HPAI-BSC/Qwen2.5-Aloe-Beta-72B is a 72.7 billion parameter open healthcare LLM developed by HPAI-BSC, built on the Qwen2.5 architecture. It is fine-tuned across 20 medical tasks on 1.8 billion tokens of medical and general-purpose data, achieving state-of-the-art performance on various medical benchmarks. The model excels at medical question answering, summarization, diagnosis, and treatment recommendation, making it suitable for research in specialized healthcare AI applications.


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model each configure the following sampler settings (the specific values are shown interactively on the model page):

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
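Since the configured values themselves are only visible on the site, here is a minimal sketch of how these sampler settings could be passed to the model through an OpenAI-compatible completion request. The endpoint shape and the parameter values below are illustrative assumptions, not the actual popular configs; check your provider's API reference for which sampler parameters it accepts.

```python
import json

def build_request(prompt: str) -> dict:
    """Build a chat-completion payload carrying the sampler parameters
    listed above. All values here are illustrative placeholders."""
    return {
        "model": "HPAI-BSC/Qwen2.5-Aloe-Beta-72B",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,        # randomness of token sampling
        "top_p": 0.9,              # nucleus sampling cutoff
        "top_k": 40,               # restrict sampling to the k most likely tokens
        "frequency_penalty": 0.0,  # penalize tokens by how often they appeared
        "presence_penalty": 0.0,   # penalize tokens that appeared at all
        "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
        "min_p": 0.05,             # drop tokens below this relative probability
    }

payload = build_request("What are common symptoms of iron-deficiency anemia?")
print(json.dumps(payload, indent=2))
```

This payload would then be POSTed to the provider's chat-completions endpoint with your API key; the sampler fields travel alongside the model name and messages.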