darkc0de/BuddyGlassKilledBonziBuddyV2
darkc0de/BuddyGlassKilledBonziBuddyV2 is a 24 billion parameter language model created by darkc0de, merged using the TIES method. It is based on huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated and incorporates components from TheDrummer/Cydonia-24B-v2 and cognitivecomputations/Dolphin3.0-Mistral-24B. The model supports a 32,768-token context length and combines the strengths of its constituent models for general-purpose language tasks.
Overview
darkc0de/BuddyGlassKilledBonziBuddyV2 is a 24 billion parameter language model resulting from a merge of several pre-trained models. It was created using the MergeKit tool, specifically employing the TIES merge method.
Merge Details
This model's architecture is based on huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated as its primary base. It integrates capabilities from two additional models:
- TheDrummer/Cydonia-24B-v2
- cognitivecomputations/Dolphin3.0-Mistral-24B
The merge configuration used a density of 0.5 and a weight of 0.5 for each contributing model, with int8_mask enabled and normalize set to false, processed in float16 precision. This approach aims to combine the distinct features and strengths of the constituent models into a single, more versatile language model.
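The parameters above can be expressed as a MergeKit recipe. The following YAML is a plausible reconstruction consistent with the details stated here (base model, TIES method, density/weight of 0.5, int8_mask, normalize false, float16); the exact file the author used is not published on this page:

```yaml
# Hypothetical MergeKit config reconstructed from the stated merge details
models:
  - model: TheDrummer/Cydonia-24B-v2
    parameters:
      density: 0.5
      weight: 0.5
  - model: cognitivecomputations/Dolphin3.0-Mistral-24B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```

A config like this would typically be run with MergeKit's `mergekit-yaml` command to produce the merged checkpoint.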
Key Characteristics
- Parameter Count: 24 billion parameters.
- Context Length: Supports a context window of 32768 tokens.
- Merge Method: Utilizes the TIES (TrIm, Elect Sign & Merge) technique, which trims low-magnitude parameter changes, resolves sign conflicts across models, and merges only the parameters whose signs agree, efficiently combining multiple models.
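To make the merge method concrete, here is a toy NumPy sketch of the TIES procedure (trim, elect sign, merge) applied to small task vectors, i.e. per-model weight deltas relative to the base. This is an illustration of the algorithm only; the real merge operates on full 24B-parameter checkpoints via MergeKit, and the function and array names here are invented for the example:

```python
import numpy as np

def ties_merge(task_vectors, density=0.5):
    """Toy TIES merge of per-model weight deltas (task vectors)."""
    # Trim: keep only the largest-magnitude `density` fraction of each vector.
    trimmed = []
    for tv in task_vectors:
        k = int(np.ceil(density * tv.size))
        cutoff = np.sort(np.abs(tv))[-k]
        trimmed.append(np.where(np.abs(tv) >= cutoff, tv, 0.0))
    trimmed = np.stack(trimmed)

    # Elect sign: pick the dominant sign per parameter across models.
    elected = np.sign(trimmed.sum(axis=0))

    # Merge: average only the entries that agree with the elected sign.
    agree = (np.sign(trimmed) == elected) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    return np.where(agree, trimmed, 0.0).sum(axis=0) / counts

# Two hypothetical task vectors with a sign conflict at index 2.
a = np.array([0.9, -0.1, 0.4, -0.8])
b = np.array([0.7, 0.2, -0.5, -0.6])
merged = ties_merge([a, b], density=0.5)
print(merged)  # conflicting/trimmed entries drop out; agreeing ones average
```

With density 0.5, each vector keeps its two largest-magnitude entries; the surviving entries at indices 0 and 3 agree in sign and are averaged, while the rest are zeroed out.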
Potential Use Cases
Given its merged nature and the diverse origins of its components, BuddyGlassKilledBonziBuddyV2 is likely suitable for a range of general-purpose language generation and understanding tasks, benefiting from the combined knowledge and capabilities of its base models.