CultriX/OmniTrixAI is a 7-billion-parameter language model created by CultriX by merging mlabonne/NeuralBeagle14-7B, FelixChao/WestSeverus-7B-DPO-v2, and CultriX/MergeTrix-7B-v2 with the DARE TIES merge method. It targets general-purpose text generation, combining the strengths of its constituent models, and offers a 4096-token context length suitable for a variety of conversational and content-creation applications.
OmniTrixAI: A Merged 7B Language Model
OmniTrixAI is a 7-billion-parameter language model from CultriX, created by merging several high-performing models. It uses the DARE TIES merge method to combine the capabilities of:
- mlabonne/NeuralBeagle14-7B
- FelixChao/WestSeverus-7B-DPO-v2
- CultriX/MergeTrix-7B-v2
The merge uses senseable/WestLake-7B-v2 as its base model. Each contributing model was assigned its own density and weight parameters, which control how much of that model's task vector is retained and how strongly it contributes to the final weights. The merge is configured for float16 precision with int8_mask enabled.
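A mergekit configuration for this kind of DARE TIES merge might look like the sketch below. The density and weight values are illustrative placeholders, not the actual parameters used for OmniTrixAI, which are not listed here.

```yaml
# Hypothetical mergekit config -- density/weight values are placeholders.
models:
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.5   # fraction of delta parameters kept (DARE drops the rest)
      weight: 0.4    # contribution of this model's task vector
  - model: FelixChao/WestSeverus-7B-DPO-v2
    parameters:
      density: 0.5
      weight: 0.3
  - model: CultriX/MergeTrix-7B-v2
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: senseable/WestLake-7B-v2
parameters:
  int8_mask: true
dtype: float16
```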
Key Capabilities
- General-purpose text generation: Designed to handle a wide array of conversational and content creation tasks.
- Leverages diverse training: Benefits from the varied datasets and fine-tuning approaches of its constituent models.
- Optimized for deployment: configured with `float16` and `int8_mask` for efficient inference.
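The DARE TIES method named above works by sparsifying each model's task vector (its delta from the base model's weights) and resolving sign conflicts before summing. A minimal numpy sketch of the idea on toy flat arrays, not a reproduction of the actual merge, which mergekit performs per-tensor on full checkpoints:

```python
import numpy as np

def dare_ties_merge(base, deltas, densities, weights, rng):
    """Toy sketch of a DARE TIES merge on flat numpy arrays.

    base: base model parameters; deltas: per-model task vectors (model - base).
    Illustrative only -- values and shapes are made up for the example.
    """
    sparsified = []
    for delta, density, weight in zip(deltas, densities, weights):
        # DARE step: randomly drop delta entries, rescale survivors by 1/density
        mask = rng.random(delta.shape) < density
        sparsified.append(weight * mask * delta / density)
    stacked = np.stack(sparsified)
    # TIES step: elect a majority sign per parameter, keep only agreeing terms
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + merged_delta

rng = np.random.default_rng(0)
base = np.zeros(8)
deltas = [np.ones(8), -np.ones(8), np.ones(8)]
merged = dare_ties_merge(base, deltas, [0.5, 0.5, 0.5], [0.4, 0.3, 0.3], rng)
print(merged)
```

In this sketch, `density` plays the role of the per-model density parameter mentioned above (the fraction of delta entries kept), and `weight` scales each model's contribution to the merged result.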
Good For
- Developers seeking a robust 7B model built from a combination of established models.
- Applications requiring a balance of performance and resource efficiency.
- Experimentation with merged model architectures for improved generalization.