BarryFutureman/WestLakeX-7B-EvoMerge
BarryFutureman/WestLakeX-7B-EvoMerge is a 7 billion parameter language model developed by BarryFutureman, resulting from a small-scale EvoMerge experiment. This model explores the impact of mutation strength on performance within the EvoMerge framework. It serves as an experimental model to evaluate evolutionary merging techniques rather than a general-purpose LLM.
Key Characteristics
- Experimental Focus: The model's creation was an experiment to assess whether higher mutation strength in EvoMerge improves performance.
- EvoMerge Application: It represents a practical application of the EvoMerge framework, providing insights into its behavior at a smaller scale.
- Performance Observation: In this experiment, higher mutation strength did not improve performance; the merged candidates' average evaluation scores fell below those of the initial population. This underscores the model's role as a research artifact rather than a production model.
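The mutation-strength experiment described above can be illustrated with a toy evolutionary loop. This is a minimal sketch, not the author's actual pipeline: the parameter vectors stand in for model weights, and the `evaluate` fitness function is a hypothetical stand-in for real benchmark scores.

```python
import random

def evaluate(params):
    # Hypothetical fitness: higher when parameters are near a fixed target.
    # In real EvoMerge, this would be an evaluation-suite score for a merged model.
    target = [0.5, -0.2, 0.9]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def crossover(a, b):
    # Uniform crossover: each "weight" is taken from one parent at random.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(params, strength):
    # Gaussian mutation; `strength` is the knob the experiment varied.
    return [p + random.gauss(0, strength) for p in params]

def evomerge(population, generations=20, strength=0.1):
    # Evolve the population, replacing the worst candidate when a
    # crossed-and-mutated child scores higher.
    for _ in range(generations):
        parents = sorted(population, key=evaluate, reverse=True)[:2]
        child = mutate(crossover(*parents), strength)
        worst = min(population, key=evaluate)
        if evaluate(child) > evaluate(worst):
            population[population.index(worst)] = child
    return max(population, key=evaluate)
```

Because replacement is elitist (a child only displaces a worse candidate), the best fitness in the population never decreases; the open question the experiment probed is whether a larger `strength` helps or hurts convergence.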
Good For
- Research into Evolutionary Merging: Ideal for researchers and developers interested in the EvoMerge technique and its parameters.
- Understanding Model Fusion: Provides a case study on how evolutionary algorithms can be applied to merge language models and the impact of specific algorithmic choices.
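As a point of reference for model fusion, the simplest merge operator is a weighted average of the parent models' parameters; evolutionary approaches like EvoMerge search over such merge choices rather than fixing them by hand. Below is a minimal sketch (plain lists stand in for parameter tensors; real merging tools operate on framework state dicts):

```python
def linear_merge(state_dicts, weights):
    # Weighted average of same-named parameters across several models.
    # `state_dicts` is a list of {param_name: [values...]} mappings;
    # `weights` are the per-model mixing coefficients (typically summing to 1).
    merged = {}
    for name in state_dicts[0]:
        merged[name] = [
            sum(w * sd[name][i] for w, sd in zip(weights, state_dicts))
            for i in range(len(state_dicts[0][name]))
        ]
    return merged
```

For example, merging two models with weights 0.5 and 0.5 yields the element-wise mean of their parameters; an evolutionary search would instead mutate and select these coefficients (and other merge settings) against an evaluation score.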