chenjingshen/Llama3-8B-merge-biomed-wizard
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Ctx length: 8k · Published: Mar 18, 2026 · License: llama3 · Architecture: Transformer

chenjingshen/Llama3-8B-merge-biomed-wizard is an 8-billion-parameter language model produced by a DARE-TIES merge of Llama3-8B-Instruct, NousResearch/Hermes-2-Pro-Llama-3-8B, and aaditya/Llama3-OpenBioLLM-8B. Developed by chenjingshen using MindNLP Wizard, the model is optimized for biomedical and general reasoning. It performs strongly on MMLU biomedical subsets as well as general reasoning benchmarks such as GSM8K and Winogrande.
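To give a sense of how a DARE-TIES merge combines several fine-tunes of a shared base, here is a minimal NumPy sketch of the idea on toy tensors. It is illustrative only, not the actual mergekit implementation used for this model; the drop rate, tensors, and function names are invented for the example. DARE sparsifies each fine-tune's delta (weights minus base) by random dropping and rescaling; TIES then elects a per-parameter sign and averages only the deltas that agree with it.

```python
import numpy as np

def dare(delta, p, rng):
    """DARE: Drop each delta component with probability p, And REscale
    the survivors by 1/(1-p) so the expected value is preserved."""
    mask = rng.random(delta.shape) >= p
    return delta * mask / (1.0 - p)

def dare_ties_merge(base, deltas, p=0.5, seed=0):
    """Toy DARE-TIES merge of several task deltas onto a shared base.

    deltas: list of arrays, each (fine-tuned weights - base weights).
    """
    rng = np.random.default_rng(seed)
    sparse = np.stack([dare(d, p, rng) for d in deltas])
    # TIES sign election: per parameter, take the sign of the summed deltas
    # (the magnitude-weighted majority direction).
    elected = np.sign(sparse.sum(axis=0))
    # Keep only components whose sign agrees with the elected sign.
    agree = np.where(np.sign(sparse) == elected, sparse, 0.0)
    # Average the surviving contributions per parameter.
    counts = np.maximum((agree != 0).sum(axis=0), 1)
    return base + agree.sum(axis=0) / counts

# Toy demo: merge two 4-parameter "fine-tunes" onto a zero base.
base = np.zeros(4)
deltas = [np.array([0.2, -0.4, 0.1, 0.0]),
          np.array([0.3, 0.5, -0.1, 0.2])]
print(dare_ties_merge(base, deltas, p=0.5))
```

With `p=0.0` no components are dropped, so merging a single delta simply adds it back to the base; raising `p` trades interference reduction against noise from sparsification.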
