EmbeddedLLM/Mistral-7B-Merge-14-v0.5
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4K | Published: Jan 12, 2024 | License: cc-by-nc-4.0 | Architecture: Transformer

EmbeddedLLM/Mistral-7B-Merge-14-v0.5 is a 7-billion-parameter language model developed by EmbeddedLLM on the Mistral-7B-v0.1 architecture. It is an experimental merge of 14 models, further refined with the DARE TIES method using additional models such as OpenHermes-2.5 and openchat-3.5. It achieves an average score of 71.96 on the Open LLM Leaderboard, with strong results across benchmarks including ARC, HellaSwag, MMLU, and GSM8K.
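To illustrate what the DARE TIES merge involves, the sketch below shows the core idea in NumPy: each fine-tuned model contributes a "delta" from the base weights; DARE randomly drops delta entries and rescales the survivors, then TIES elects a per-parameter majority sign and averages only the deltas that agree with it. This is a simplified, hypothetical illustration of the technique on flat arrays, not the actual merge recipe or tooling used for this model.

```python
import numpy as np

def dare(delta, drop_p, rng):
    # DARE: randomly drop delta entries with probability drop_p,
    # then rescale survivors by 1/(1 - drop_p) to preserve expectation.
    mask = rng.random(delta.shape) >= drop_p
    return delta * mask / (1.0 - drop_p)

def dare_ties_merge(base, deltas, drop_p=0.5, seed=0):
    rng = np.random.default_rng(seed)
    sparse = [dare(d, drop_p, rng) for d in deltas]
    # TIES sign election: majority sign per parameter, weighted by magnitude.
    elected_sign = np.sign(np.sum(sparse, axis=0))
    # Keep only deltas agreeing with the elected sign, then average survivors.
    agree = [np.where(np.sign(s) == elected_sign, s, 0.0) for s in sparse]
    counts = np.sum([a != 0 for a in agree], axis=0)
    merged_delta = np.sum(agree, axis=0) / np.maximum(counts, 1)
    return base + merged_delta

# Example: with no dropout and identical deltas, the merge recovers base + delta.
base = np.zeros(4)
d = np.array([1.0, -2.0, 0.0, 3.0])
merged = dare_ties_merge(base, [d, d, d], drop_p=0.0)
```

In the real procedure this is applied per weight tensor across all 14 source models, with drop rates and weights tuned per model.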
