EmbeddedLLM/Mistral-7B-Merge-14-v0.1
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4K | Published: Dec 18, 2023 | License: cc-by-nc-4.0 | Architecture: Transformer | Open Weights
EmbeddedLLM/Mistral-7B-Merge-14-v0.1 is a 7 billion parameter language model based on the Mistral architecture, created by EmbeddedLLM through an experimental merging process. It was constructed by combining 14 different Mistral-7B variants with DARE TIES, followed by a Gradient SLERP merge with janai-hq/trinity-v1. It is intended as a general-purpose base model and typically benefits from further instruction fine-tuning before downstream use.
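The final Gradient SLERP step interpolates each tensor of the merged base with the corresponding tensor from janai-hq/trinity-v1 along the unit sphere rather than linearly. The snippet below is a minimal NumPy sketch of SLERP for a single pair of weight tensors; the tensor shapes, the interpolation factor t, and the linear fallback for near-parallel weights are illustrative assumptions, not the exact mergekit recipe used for this model.

```python
import numpy as np

def slerp(t: float, w0: np.ndarray, w1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns w0, t=1 returns w1. Shapes must match.
    """
    v0 = w0.ravel().astype(np.float64)
    v1 = w1.ravel().astype(np.float64)

    # Angle between the two flattened weight vectors.
    cos_omega = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1) + eps)
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))

    # Nearly parallel tensors: fall back to plain linear interpolation.
    if np.sin(omega) < eps:
        merged = (1.0 - t) * v0 + t * v1
    else:
        merged = (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

    return merged.reshape(w0.shape).astype(w0.dtype)


# Illustrative usage with random stand-ins for one layer's weights
# (hypothetical shapes; a real merge iterates over every tensor in both checkpoints).
base_weight = np.random.randn(1024, 1024).astype(np.float32)
trinity_weight = np.random.randn(1024, 1024).astype(np.float32)
merged_weight = slerp(0.5, base_weight, trinity_weight)
```

In a "gradient" SLERP merge the factor t is typically not a single constant but varies across layers, weighting one parent more heavily in some parts of the network than the other.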