EmbeddedLLM/Mistral-7B-Merge-14-v0.4
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Jan 3, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
EmbeddedLLM/Mistral-7B-Merge-14-v0.4 is a 7-billion-parameter language model from EmbeddedLLM, created through a multi-stage merging process: 14 models are first combined with DARE TIES, and the result is then merged via Gradient SLERP with Weyaxi/OpenHermes-2.5-neural-chat-v3-3-openchat-3.5-1210-Slerp. This experimental model achieves an average score of 71.19 on the Open LLM Leaderboard, with solid results across ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, and GSM8K, making it suitable for general language tasks.
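For intuition about the SLERP stage, the sketch below shows plain spherical linear interpolation between two weight tensors. It is a simplified illustration, not the actual merge recipe: the Gradient SLERP used here varies the interpolation factor across layers, and the `slerp` helper and its parameters are assumptions for this example.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats the flattened tensors as points on a hypersphere and
    interpolates along the great-circle arc between them; t=0 returns
    a, t=1 returns b.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    # Angle between the two weight vectors.
    omega = torch.acos(torch.clamp(a_dir @ b_dir, -1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return ((1.0 - t) * a_flat + t * b_flat).reshape(a.shape).to(a.dtype)
    out = (torch.sin((1.0 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape).to(a.dtype)
```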
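Since the weights are open, the merged model can be run locally. Below is a minimal sketch using the standard Hugging Face transformers API, assuming the repo id above is available on the Hub; FP16 is used as a safe local default since the FP8 quantization listed above applies to the hosted serving configuration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EmbeddedLLM/Mistral-7B-Merge-14-v0.4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # local default; hosted FP8 quant is provider-side
    device_map="auto",
)

prompt = "Explain model merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```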