kaitchup/Mayonnaise-4in1-03
Overview
kaitchup/Mayonnaise-4in1-03 is a 7 billion parameter causal language model developed by The Kaitchup. It is an English-language model built on the mistralai/Mistral-7B-v0.1 architecture and constructed with the mergekit tool using the TIES-merging method, which combines several fine-tuned Mistral-7B variants into a single dense 7B checkpoint.
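Since this is a standard Mistral-style causal LM, it can presumably be loaded with the Hugging Face transformers library. The sketch below assumes the published checkpoint follows the usual AutoModel conventions; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load and sample from the model with transformers.
# Assumes the checkpoint uses standard Mistral/AutoModel conventions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kaitchup/Mayonnaise-4in1-03"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on one GPU
    device_map="auto",          # requires the accelerate package
)

prompt = "Model merging is useful because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```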
Key Capabilities
This model is a composite of several fine-tuned models, merged to potentially leverage their individual strengths. The merging configuration includes:
- Base model: mncai/mistral-7b-dpo-v5
- Merged models: flemmingmiguel/MBX-7B and BarryFutureman/NeuralTurdusVariant1-7B, each with specified densities and weights (see the config sketch after this list).
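For context, mergekit TIES merges are driven by a YAML configuration. The following is a hedged sketch of what such a config could look like for this merge; the density and weight values are hypothetical placeholders (the actual values are not stated here), and the file and output paths are illustrative.

```python
# Sketch: write out a hypothetical mergekit TIES config for this merge.
# The density and weight values are placeholders, not the actual recipe.
from pathlib import Path

config = """\
merge_method: ties
base_model: mncai/mistral-7b-dpo-v5
models:
  - model: flemmingmiguel/MBX-7B
    parameters:
      density: 0.5   # placeholder value
      weight: 0.5    # placeholder value
  - model: BarryFutureman/NeuralTurdusVariant1-7B
    parameters:
      density: 0.5   # placeholder value
      weight: 0.5    # placeholder value
parameters:
  normalize: true
dtype: float16
"""

Path("ties-config.yaml").write_text(config)
# Merge with mergekit's command-line entry point:
#   mergekit-yaml ties-config.yaml ./merged-model
```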
Unique Approach
The creation of Mayonnaise-4in1-03 is detailed in the article "The Mayonnaise: Rank First on the Open LLM Leaderboard with TIES-Merging", which outlines the recipe for achieving high performance through TIES-merging. This method aims to combine the knowledge and capabilities of multiple models into a single, more robust model. The model is licensed under Apache 2.0.
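At a high level, TIES-merging (from the paper "TIES-Merging: Resolving Interference When Merging Models") trims each fine-tuned model's parameter changes to the largest-magnitude entries, elects a majority sign per parameter, and averages only the changes that agree with that sign. The toy NumPy sketch below illustrates the idea on flat parameter vectors; it is a simplified illustration, not mergekit's implementation.

```python
# Toy TIES merge over flat parameter vectors (illustration only).
import numpy as np

def ties_merge(base, finetuned, density=0.5, weights=None):
    weights = weights or [1.0] * len(finetuned)
    trimmed = []
    for ft in finetuned:
        delta = ft - base
        # Trim: keep only the top-`density` fraction of entries by magnitude.
        k = max(1, int(density * delta.size))
        cutoff = np.sort(np.abs(delta))[-k]
        trimmed.append(np.where(np.abs(delta) >= cutoff, delta, 0.0))
    stacked = np.stack([w * t for w, t in zip(weights, trimmed)])
    # Elect: pick the dominant sign for each parameter across models.
    elected = np.sign(stacked.sum(axis=0))
    # Disjoint merge: average only the updates that agree with the elected sign.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    kept = np.where(agree, stacked, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)
    return base + kept.sum(axis=0) / counts

# Example with random vectors standing in for model weights:
rng = np.random.default_rng(0)
base = rng.standard_normal(1000)
finetuned = [base + 0.1 * rng.standard_normal(1000) for _ in range(2)]
merged = ties_merge(base, finetuned, density=0.5)
```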