EmbeddedLLM/Mistral-7B-Merge-14-v0.2
Task: Text Generation
Model Size: 7B
Quant: FP8
Context Length: 4K
Concurrency Cost: 1
Published: Dec 18, 2023
License: cc-by-nc-4.0
Architecture: Transformer
Open Weights

EmbeddedLLM/Mistral-7B-Merge-14-v0.2 is a 7-billion-parameter Mistral-based language model created by EmbeddedLLM through an experimental merge of 14 different models using the DARE TIES and Gradient SLERP techniques. It is a base model with a 4096-token context length that demonstrates strong general performance, achieving an average score of 72.88 on the Open LLM Leaderboard, and is intended for further instruction fine-tuning.
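Below is a minimal sketch of loading the model for plain text completion with Hugging Face transformers. The model ID comes from this card; the dtype and generation settings are illustrative assumptions, and since the card describes a base model (not instruction-tuned), no chat template is applied.

```python
# Sketch: load EmbeddedLLM/Mistral-7B-Merge-14-v0.2 and run a short completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EmbeddedLLM/Mistral-7B-Merge-14-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit a single GPU
    device_map="auto",
)

# Base model, so we use raw completion rather than a chat prompt.
prompt = "The key idea behind model merging is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```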
