vishnukv/newmerge
vishnukv/newmerge is a 7 billion parameter language model created by vishnukv by merging pre-trained models with the DARE TIES method. It uses uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b as its base and incorporates PetroGPT/WestSeverus-7B-DPO and vishnukv/WestSeverusJaskier. The model is designed to combine the strengths of its constituent models for general language generation tasks.
Overview
vishnukv/newmerge is a 7 billion parameter language model developed by vishnukv. It was created using the mergekit tool, specifically employing the DARE TIES merge method.
Merge Method
The model was produced by merging several pre-trained language models. The merging process used the DARE TIES method, which combines DARE (Drop And REscale), a technique that randomly drops a fraction of each fine-tuned model's delta parameters and rescales the survivors, with TIES-Merging (TrIm, Elect Sign and merge), which trims low-magnitude deltas and resolves sign conflicts between models before merging. The base model for this merge was uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b.
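As a concrete illustration, a mergekit configuration for a DARE TIES merge of these models might look like the sketch below. The `density` and `weight` values are illustrative assumptions; the author's actual settings are not published in this card.

```yaml
# Hypothetical mergekit config; density/weight values are assumed, not the author's.
models:
  - model: PetroGPT/WestSeverus-7B-DPO
    parameters:
      density: 0.5   # fraction of delta parameters kept by DARE
      weight: 0.5    # mixing weight for this model's deltas
  - model: vishnukv/WestSeverusJaskier
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
dtype: bfloat16
```

A config like this is passed to the `mergekit-yaml` command-line tool, which writes the merged checkpoint to an output directory.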
Constituent Models
The newmerge model integrates knowledge and capabilities from the following models:
- uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b (base model)
- PetroGPT/WestSeverus-7B-DPO
- vishnukv/WestSeverusJaskier
Key Capabilities
- Combines features from multiple specialized 7B models.
- Leverages the DARE TIES merging technique, which sparsifies and sign-resolves parameter deltas to reduce interference between the merged models.
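The intuition behind DARE TIES can be sketched in a few lines of plain Python. This toy example operates on flat weight vectors and is a conceptual illustration only, not the mergekit implementation: DARE randomly drops delta entries and rescales survivors, then TIES elects a majority sign per parameter and averages only the agreeing contributions.

```python
import random

random.seed(0)

def dare_ties_merge(base, finetuned, density):
    """Toy DARE TIES sketch on flat weight vectors (illustrative only)."""
    # DARE: keep each delta entry with probability `density`,
    # rescaling survivors by 1/density to preserve expectation.
    deltas = []
    for ft in finetuned:
        delta = [
            (f - b) / density if random.random() < density else 0.0
            for f, b in zip(ft, base)
        ]
        deltas.append(delta)
    merged = []
    for i, b in enumerate(base):
        vals = [d[i] for d in deltas]
        # TIES: elect the majority sign, then average only the
        # contributions that agree with it.
        elected = 1.0 if sum(vals) >= 0 else -1.0
        keep = [v for v in vals if v * elected > 0]
        merged.append(b + sum(keep) / len(keep) if keep else b)
    return merged

base = [0.0, 0.0, 0.0]
ft_a = [1.0, -1.0, 2.0]
ft_b = [1.0, 1.0, 2.0]
print(dare_ties_merge(base, [ft_a, ft_b], density=0.9))
```

With `density=1.0` no entries are dropped, so the sign-election step alone determines the result; lower densities trade fidelity for sparser, less interfering deltas.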
Good For
- General language generation and understanding tasks.
- Experimentation with merged model architectures.