Overview
ReadyArt/C4-Broken-Tutu-24B is a 24-billion-parameter language model from ReadyArt, created with the DARE TIES merge method, which combines multiple pre-trained models to synthesize their capabilities. The merge uses ReadyArt/The-Omega-Directive-M-24B-v1.1 as its base model.
Merge Details
This model is a composite of several distinct language models, each contributing to its overall performance. The models merged include:
- ReadyArt/Forgotten-Safeword-24B
- TroyDoesAI/BlackSheep-24B
- TheDrummer/Cydonia-24B-v4
- ReadyArt/Omega-Darker_The-Final-Directive-24B
Each constituent model was assigned an equal weight of 0.2 in the DARE TIES configuration, with a density of 0.3 (i.e., roughly 30% of each model's delta parameters are retained and rescaled). This strategy aims to consolidate the strengths of the individual models into a single, more capable model.
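The merge described above corresponds to a mergekit configuration along these lines. This is a hypothetical reconstruction from the stated weights, density, and model list; any field not mentioned in the text (such as `dtype`) is an assumption:

```yaml
# Hypothetical mergekit config reconstructed from the description above;
# dtype is an assumption, not stated in the model card.
merge_method: dare_ties
base_model: ReadyArt/The-Omega-Directive-M-24B-v1.1
models:
  - model: ReadyArt/Forgotten-Safeword-24B
    parameters:
      weight: 0.2
      density: 0.3
  - model: TroyDoesAI/BlackSheep-24B
    parameters:
      weight: 0.2
      density: 0.3
  - model: TheDrummer/Cydonia-24B-v4
    parameters:
      weight: 0.2
      density: 0.3
  - model: ReadyArt/Omega-Darker_The-Final-Directive-24B
    parameters:
      weight: 0.2
      density: 0.3
dtype: bfloat16
```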
Key Characteristics
- Parameter Count: 24 billion.
- Merge Method: Utilizes the DARE TIES method for combining models.
- Constituent Models: Integrates four distinct 24B models, including contributions from community model creators TheDrummer and TroyDoesAI.
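To make the merge method concrete, here is a minimal NumPy sketch of the per-tensor DARE TIES operation: DARE randomly drops delta parameters and rescales the survivors, then TIES elects a majority sign per parameter and sums only agreeing contributions. This is an illustrative simplification under stated assumptions, not the actual mergekit implementation; the function name and toy tensors are hypothetical.

```python
import numpy as np

def dare_ties_merge(base, deltas, weight=0.2, density=0.3, seed=0):
    """Toy per-tensor DARE TIES merge: base tensor + merged task deltas."""
    rng = np.random.default_rng(seed)
    sparsified = []
    for d in deltas:
        # DARE: keep each delta entry with probability `density`,
        # rescale survivors by 1/density to preserve expected magnitude.
        mask = rng.random(d.shape) < density
        sparsified.append(np.where(mask, d / density, 0.0))
    # Apply the per-model merge weight (0.2 for each of the four models here).
    weighted = [weight * s for s in sparsified]
    # TIES: elect the majority sign per parameter across all contributions.
    elected = np.sign(sum(np.sign(w) for w in weighted))
    # Sum only the contributions whose sign agrees with the elected sign.
    merged_delta = sum(np.where(np.sign(w) == elected, w, 0.0) for w in weighted)
    return base + merged_delta

# Toy demonstration: with density=1.0 nothing is dropped, so four identical
# unit deltas at weight 0.2 merge to base + 0.8.
merged = dare_ties_merge(np.zeros(4), [np.ones(4) for _ in range(4)],
                         weight=0.2, density=1.0)
```

In the real merge, this operation runs over every weight tensor of the 24B models, with the deltas taken relative to the base model.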
Intended Use
This model targets applications that benefit from the combined strengths of its merged components. Its composition suggests balanced performance across a range of NLP tasks rather than specialization in a single domain.