CultriX/CombinaTrix-7B is a 7-billion-parameter language model created by CultriX by merging mlabonne/NeuralBeagle14-7B, FelixChao/WestSeverus-7B-DPO-v2, and jsfs11/TurdusTrixBeagle-DARETIES-7B with the DARE TIES merge method. Built on the senseable/WestLake-7B-v2 base model, it supports a 4096-token context window and is designed to combine the strengths of its constituent models for general-purpose text generation.
## Overview
CombinaTrix-7B is a 7-billion-parameter language model developed by CultriX. It was produced by merging three existing 7B models: mlabonne/NeuralBeagle14-7B, FelixChao/WestSeverus-7B-DPO-v2, and jsfs11/TurdusTrixBeagle-DARETIES-7B. The merge was performed using the DARE TIES method via LazyMergekit, with senseable/WestLake-7B-v2 serving as the base model; an illustrative configuration sketch follows.
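LazyMergekit drives merges through a declarative mergekit YAML file. The configuration below is a sketch of what a DARE TIES merge over these models could look like; the density and weight values are hypothetical placeholders for illustration, not the settings CultriX actually used.

```yaml
# Illustrative mergekit configuration for a DARE TIES merge.
# NOTE: density/weight values are hypothetical placeholders,
# not the author's actual settings.
models:
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.55   # fraction of delta weights retained (assumed)
      weight: 0.35    # this model's contribution to the merge (assumed)
  - model: FelixChao/WestSeverus-7B-DPO-v2
    parameters:
      density: 0.55
      weight: 0.35
  - model: jsfs11/TurdusTrixBeagle-DARETIES-7B
    parameters:
      density: 0.55
      weight: 0.30
merge_method: dare_ties
base_model: senseable/WestLake-7B-v2
parameters:
  int8_mask: true
dtype: bfloat16
```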
## Key Characteristics
- Architecture: A merged model combining multiple 7B parameter models.
- Merge Method: Utilizes the DARE TIES merging technique.
- Base Model: Built on senseable/WestLake-7B-v2.
- Parameter Count: 7 billion parameters.
- Context Length: Supports a context window of 4096 tokens.
## Usage
This model is suitable for general text generation tasks, benefiting from the combined capabilities of its merged components. Developers can integrate it with the Hugging Face transformers library; a minimal inference example is sketched below. Users are encouraged to check the CultriX Hugging Face Space for the latest benchmark results and performance insights.
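The following Python sketch shows one way to load the model and run inference with transformers; the dtype, sampling settings, and token budget are illustrative choices, not values prescribed by the model card.

```python
# Minimal inference sketch for CultriX/CombinaTrix-7B.
# Generation settings below are illustrative, not prescribed by the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CultriX/CombinaTrix-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on a single consumer GPU
    device_map="auto",          # place layers automatically across available devices
)

prompt = "Summarize the benefits of merging language models in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt plus generated tokens within the model's 4096-token context window.
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```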