darkc0de/BuddyGlassUncensored2025.6
darkc0de/BuddyGlassUncensored2025.6 is a 24-billion-parameter language model created by darkc0de, merged using the DELLA method with mistralai/Mistral-Small-24B-Instruct-2501 as its base. The merge integrates Dolphin3.0-Mistral-24B, Cydonia-24B-v2, and two abliterated models from huihui-ai (Mistral-Small-24B-Instruct-2501-abliterated and Arcee-Blitz-abliterated), with the aim of combining the strengths of its constituent models for general-purpose language generation.
Model Overview
darkc0de/BuddyGlassUncensored2025.6 is a 24-billion-parameter language model developed by darkc0de. It was created using the DELLA merge method, a technique for combining the strengths of multiple pre-trained language models into a single model. The base model for the merge was mistralai/Mistral-Small-24B-Instruct-2501.
Key Merge Details
This model is a composite of several distinct models, each contributing to its overall performance:
- cognitivecomputations/Dolphin3.0-Mistral-24B
- TheDrummer/Cydonia-24B-v2
- huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
- huihui-ai/Arcee-Blitz-abliterated
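A merge of this shape is typically expressed as a mergekit configuration. The sketch below is hypothetical: the density and weight values are illustrative placeholders, since the card does not publish the exact per-model parameters.

```yaml
# Hypothetical mergekit config sketch for a DELLA merge of the listed models.
# The density/weight values are placeholders, not the actual parameters used.
merge_method: della
base_model: mistralai/Mistral-Small-24B-Instruct-2501
models:
  - model: cognitivecomputations/Dolphin3.0-Mistral-24B
    parameters:
      density: 0.5
      weight: 0.25
  - model: TheDrummer/Cydonia-24B-v2
    parameters:
      density: 0.5
      weight: 0.25
  - model: huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
    parameters:
      density: 0.5
      weight: 0.25
  - model: huihui-ai/Arcee-Blitz-abliterated
    parameters:
      density: 0.5
      weight: 0.25
parameters:
  int8_mask: true
dtype: float16
```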
The DELLA merge method, described in arXiv:2406.11617, was applied with per-model density and weight parameters for each constituent. The configuration also sets int8_mask: true and dtype: float16, which respectively reduce memory use during the merge and set the precision of the merged weights.
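The core idea of DELLA's pruning step (MAGPRUNE) can be illustrated with a small numpy sketch: each fine-tuned model's delta from the base is pruned stochastically, with smaller-magnitude delta parameters assigned higher drop probabilities, survivors rescaled by 1/(1 - p), and the pruned deltas combined as a weighted sum. This is a simplified toy illustration of the paper's idea, not mergekit's implementation; the function name and the density/epsilon parameters are hypothetical.

```python
import numpy as np

def della_merge(base, finetuned, weights, density=0.5, epsilon=0.2, seed=0):
    """Toy sketch of DELLA-style merging (arXiv:2406.11617).

    For each fine-tuned model, form its delta from the base, assign each
    delta parameter a drop probability inversely related to its magnitude
    rank (MAGPRUNE), rescale survivors by 1/(1 - p), then add the weighted
    sum of pruned deltas back onto the base weights.
    """
    rng = np.random.default_rng(seed)
    n = base.size
    merged_delta = np.zeros(n, dtype=np.float64)
    for task_model, w in zip(finetuned, weights):
        delta = (task_model - base).ravel()
        # Rank delta magnitudes: rank 0 = smallest |delta|.
        rank = np.argsort(np.argsort(np.abs(delta)))
        p_mean = 1.0 - density  # average drop probability
        # Spread drop probabilities over [p_mean - eps/2, p_mean + eps/2],
        # highest for the smallest-magnitude deltas.
        p = p_mean + epsilon * (0.5 - rank / max(n - 1, 1))
        p = np.clip(p, 0.0, 1.0 - 1e-6)
        keep = rng.random(n) >= p
        # Rescale surviving deltas so the merge is unbiased in expectation.
        pruned = np.where(keep, delta / (1.0 - p), 0.0)
        merged_delta += w * pruned
    return (base.ravel() + merged_delta).reshape(base.shape)
```

With density=1.0 and epsilon=0.0 nothing is dropped and the merge of a single model reproduces that model exactly, which is a useful sanity check on the rescaling logic.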
Intended Use Cases
Given its Mistral-Small-24B-Instruct base and the inclusion of diverse models like Dolphin3.0 and Cydonia, BuddyGlassUncensored2025.6 is suitable for a wide range of general-purpose natural language processing tasks. Its merged weights aim to provide robust performance across instruction following, creative text generation, and conversational AI applications.