darkc0de/XortronGlitched24B
darkc0de/XortronGlitched24B is a 24 billion parameter language model created by darkc0de, merged using the DELLA method with TroyDoesAI/BlackSheep-24B as its base. It integrates capabilities from four 24B models: TheDrummer/Cydonia-24B-v2, huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated, cognitivecomputations/Dolphin3.0-Mistral-24B, and huihui-ai/Arcee-Blitz-abliterated. With a 32768-token context length, it aims to combine the strengths of its constituent models for diverse generative tasks.
Model Overview
darkc0de/XortronGlitched24B is a 24 billion parameter language model developed by darkc0de. It was created with the DELLA merge method from mergekit, using TroyDoesAI/BlackSheep-24B as the base model. DELLA merges fine-tuned models by pruning each model's delta parameters through magnitude-based sampling and rescaling the survivors, which reduces interference when the deltas are combined.
Key Capabilities
This model integrates the characteristics of several distinct 24B models, aiming to leverage their combined strengths. The merged components include:
- TheDrummer/Cydonia-24B-v2
- huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
- cognitivecomputations/Dolphin3.0-Mistral-24B
- huihui-ai/Arcee-Blitz-abliterated
The merging process used a normalized configuration with int8_mask enabled: normalization rescales the source models' contributions so they sum consistently, while the int8 mask stores intermediate merge masks in 8-bit form to reduce memory use during the merge.
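The exact recipe is not reproduced on this card, but a DELLA merge of this shape is typically expressed as a mergekit YAML config and run through the mergekit-yaml CLI. The sketch below is illustrative only: the per-model weight and density values, the dtype, and the file paths are placeholder assumptions, not the published configuration.

```python
# Illustrative sketch of driving a DELLA merge with mergekit.
# weight/density values, dtype, and paths are placeholders; the
# actual recipe used for XortronGlitched24B is not published here.
import subprocess
import textwrap

config = textwrap.dedent("""\
    merge_method: della
    base_model: TroyDoesAI/BlackSheep-24B
    models:
      - model: TheDrummer/Cydonia-24B-v2
        parameters:
          weight: 0.25
          density: 0.5
      - model: huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
        parameters:
          weight: 0.25
          density: 0.5
      - model: cognitivecomputations/Dolphin3.0-Mistral-24B
        parameters:
          weight: 0.25
          density: 0.5
      - model: huihui-ai/Arcee-Blitz-abliterated
        parameters:
          weight: 0.25
          density: 0.5
    parameters:
      normalize: true
      int8_mask: true
    dtype: bfloat16
""")

with open("della-config.yaml", "w", encoding="utf-8") as f:
    f.write(config)

# mergekit's CLI reads the YAML config and writes the merged checkpoint:
subprocess.run(
    ["mergekit-yaml", "della-config.yaml", "./merged-model"],
    check=True,
)
```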
Good For
Given its composition from multiple specialized models, XortronGlitched24B suits users who want a single versatile 24B model rather than switching between its sources. It is also relevant to anyone studying how advanced merge methods like DELLA behave when applied to a set of strong fine-tunes of the same base architecture.
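A minimal inference sketch with Hugging Face transformers is shown below. The generation settings and hardware assumptions (bf16 weights, enough GPU memory to shard a 24B model) are illustrative, and it assumes the merged model ships a chat template, which its Mistral-Small-derived sources do.

```python
# Minimal inference sketch; the model id comes from this card,
# everything else (dtype, device mapping, prompt) is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "darkc0de/XortronGlitched24B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # a 24B model needs roughly 48 GB in bf16
    device_map="auto",           # shard across available GPUs
)

messages = [
    {"role": "user", "content": "Summarize the DELLA merge method in two sentences."}
]
# Assumes the merged model inherits a chat template from its sources.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```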