darkc0de/BuddyGlassUncensored2025.4
darkc0de/BuddyGlassUncensored2025.4 is a 24 billion parameter language model created by darkc0de using the DARE TIES merge method. It is based on mistralai/Mistral-Small-24B-Instruct-2501 and integrates components from several other 24B models, including those from huihui-ai, TheDrummer, and cognitivecomputations. This model is designed to combine the strengths of its constituent models, offering a versatile foundation for various instruction-following tasks.
Model Overview
darkc0de/BuddyGlassUncensored2025.4 is a 24 billion parameter language model developed by darkc0de. It was created using the DARE TIES merge method, a technique designed to combine the strengths of multiple pre-trained language models into a single, more capable model. The base model for this merge was mistralai/Mistral-Small-24B-Instruct-2501.
Merge Details
This model is a composite of four distinct 24B models, carefully selected to enhance its overall performance and capabilities. The merged components include:
- huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
- TheDrummer/Cydonia-24B-v2
- huihui-ai/Arcee-Blitz-abliterated
- cognitivecomputations/Dolphin3.0-Mistral-24B
The DARE TIES method was applied with specific density and weight parameters for each contributing model, aiming to create a balanced and robust instruction-following model. The configuration also included int8_mask and float16 dtype for efficiency.
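A merge of this shape is typically expressed as a mergekit configuration. The sketch below is illustrative only: the density and weight values are placeholders, since the actual per-model parameters are not reproduced here; the merge method, base model, `int8_mask`, and `float16` dtype follow the description above.

```yaml
# Hypothetical mergekit config for a DARE TIES merge of the four models.
# density/weight values are placeholders, NOT the values darkc0de used.
models:
  - model: huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
    parameters:
      density: 0.5   # placeholder
      weight: 0.25   # placeholder
  - model: TheDrummer/Cydonia-24B-v2
    parameters:
      density: 0.5
      weight: 0.25
  - model: huihui-ai/Arcee-Blitz-abliterated
    parameters:
      density: 0.5
      weight: 0.25
  - model: cognitivecomputations/Dolphin3.0-Mistral-24B
    parameters:
      density: 0.5
      weight: 0.25
merge_method: dare_ties
base_model: mistralai/Mistral-Small-24B-Instruct-2501
parameters:
  int8_mask: true
dtype: float16
```

In a DARE TIES merge, each model's `density` controls how many of its delta parameters (relative to the base model) are randomly retained, and `weight` scales its contribution when the surviving deltas are combined and sign-resolved.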
Potential Use Cases
Because it combines components with diverse origins, BuddyGlassUncensored2025.4 should suit a broad range of applications that require strong instruction following, potentially drawing on the combined knowledge and reasoning abilities of its constituent models.