abhinand/Llama-3-OpenBioMed-8B-dare-ties-v1.0
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · License: llama3 · Architecture: Transformer · Warm

Llama-3-OpenBioMed-8B-dare-ties-v1.0 is an 8 billion parameter language model based on the Llama 3 architecture, created by abhinand. This model is a merge of three specialized biomedical LLMs using the dare_ties method, focusing on enhanced performance in the biomedical domain. It integrates knowledge from Llama3-OpenBioLLM-8B, JSL-MedLlama-3-8B-v1.0, and WiNGPT2-Llama-3-8B-Base to provide comprehensive biomedical understanding. The model is optimized for tasks requiring deep knowledge of medical and biological concepts, offering a context length of 8192 tokens.
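Merges like this are typically produced with the mergekit toolkit. The sketch below shows what a dare_ties merge of the three source models could look like; the exact repository paths, densities, and weights are assumptions for illustration and are not taken from the author's published config.

```yaml
# Hypothetical mergekit config (values are illustrative, not the author's).
merge_method: dare_ties
base_model: meta-llama/Meta-Llama-3-8B
models:
  - model: aaditya/Llama3-OpenBioLLM-8B
    parameters:
      density: 0.5   # fraction of delta weights kept after random pruning
      weight: 0.4    # contribution of this model to the merge
  - model: johnsnowlabs/JSL-MedLlama-3-8B-v1.0
    parameters:
      density: 0.5
      weight: 0.3
  - model: winninghealth/WiNGPT2-Llama-3-8B-Base
    parameters:
      density: 0.5
      weight: 0.3
dtype: bfloat16
```

In dare_ties, each fine-tuned model's delta from the base is randomly sparsified (the density), rescaled, and then sign-consensus merged, which lets several domain specialists be combined with less parameter interference than naive averaging.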


Popular Sampler Settings

Featherless users tune the following sampler parameters for this model:

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
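A minimal sketch of how these sampler settings map onto a request body for an OpenAI-compatible completions endpoint. The helper name and the specific values are illustrative defaults, not the community configurations from this page.

```python
# Hypothetical helper: builds a completions request payload carrying the
# sampler parameters listed above. Values are illustrative assumptions.
def build_payload(prompt: str) -> dict:
    return {
        "model": "abhinand/Llama-3-OpenBioMed-8B-dare-ties-v1.0",
        "prompt": prompt,
        "temperature": 0.7,         # randomness of token sampling
        "top_p": 0.9,               # nucleus sampling cutoff
        "top_k": 40,                # restrict sampling to k most likely tokens
        "frequency_penalty": 0.0,   # penalize tokens by how often they appear
        "presence_penalty": 0.0,    # penalize tokens that have appeared at all
        "repetition_penalty": 1.1,  # >1.0 discourages verbatim repetition
        "min_p": 0.05,              # drop tokens below this relative probability
    }
```

Lower temperature and a mild repetition_penalty are common starting points for factual biomedical Q&A, where deterministic, non-repetitive answers matter more than creative variety.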