shanchen/llama3-8B-slerp-biomed-chat-chinese
Text generation · Model size: 8B · Quantization: FP8 · Context length: 8k · Published: Apr 30, 2024 · License: llama3 · Architecture: Transformer

shanchen/llama3-8B-slerp-biomed-chat-chinese is an 8-billion-parameter language model produced by merging shanchen/llama3-8B-slerp-med-chinese and shenzhi-wang/Llama3-8B-Chinese-Chat with the slerp (spherical linear interpolation) method. The model targets biomedical chat in Chinese, combining medical domain knowledge with general Chinese conversational ability. It uses the Llama3 architecture and is intended to run in bfloat16 precision.
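The core idea behind a slerp merge is to interpolate each pair of corresponding weight tensors along the arc between them rather than along a straight line, which better preserves the magnitude structure of the weights. Below is a minimal, illustrative sketch of the interpolation formula itself using NumPy on toy vectors; real merge tooling (e.g. mergekit) applies this per-tensor across two checkpoints with additional handling that this sketch omits.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t is the mixing factor in [0, 1]: 0 returns v0, 1 returns v1.
    """
    # Normalize copies to measure the angle between the two vectors
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two directions
    so = np.sin(omega)
    if abs(so) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    # Interpolate along the arc, weighting each endpoint by sin terms
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example on two orthogonal unit "weight" vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
print(mid)  # halfway along the arc: [0.7071... 0.7071...]
```

Unlike a plain average (which would give `[0.5, 0.5]`, shrinking the norm), the slerp midpoint keeps unit length, which is one reason the method is popular for merging model checkpoints.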
