darkc0de/BuddyGlassIsBonziBuddyUncensored

Text Generation · Concurrency cost: 2 · Model size: 24B · Quantization: FP8 · Context length: 32k · Published: Mar 1, 2025 · Architecture: Transformer

darkc0de/BuddyGlassIsBonziBuddyUncensored is a 24-billion-parameter language model, merged with the TIES method on a mistralai/Mistral-Small-24B-Instruct-2501 base. It integrates huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated, cognitivecomputations/Dolphin3.0-Mistral-24B, and TheDrummer/Cydonia-24B-v2, combining their instruction-following, conversational, and creative strengths in a single model. With a 32,768-token context length, it is designed for tasks that require extensive contextual understanding and nuanced responses.


Model Overview

darkc0de/BuddyGlassIsBonziBuddyUncensored is a 24-billion-parameter language model created by merging three pre-trained models with the TIES method, built upon a mistralai/Mistral-Small-24B-Instruct-2501 base.
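
As a minimal sketch (assuming the merged weights are published under this ID on the Hugging Face Hub and that the repository ships the chat template inherited from Mistral Small), the model can be loaded and queried like any other Mistral-Small-based checkpoint:

```python
# Minimal inference sketch using Hugging Face transformers.
# Assumes the merged weights are hosted on the Hub under this ID and
# that a Mistral-Small-style chat template is included in the repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "darkc0de/BuddyGlassIsBonziBuddyUncensored"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision for the 24B weights; adjust to your hardware
    device_map="auto",
)

messages = [{"role": "user", "content": "Summarize the TIES merge method in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```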

Key Capabilities

This model combines the strengths of three distinct models:

  • huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated: An abliterated (refusal-reduced) variant of the base model; contributes instruction following and general language understanding.
  • cognitivecomputations/Dolphin3.0-Mistral-24B: Likely enhances conversational ability; the Dolphin series is tuned for compliant, steerable assistant behavior with few refusals.
  • TheDrummer/Cydonia-24B-v2: A fine-tune generally oriented toward creative writing, adding stylistic range and prose variety.

Merge Details

The merge was configured with equal density and weight parameters (0.5) for each contributing model. The process used mergekit with int8_mask: true and dtype: float16 for memory-efficient operation. This composite approach aims to combine the strengths of the constituent models in a single, versatile 24B-parameter model with a 32,768-token context length.
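
A sketch of what the corresponding mergekit configuration could look like, reconstructed from the parameters stated above (TIES method, equal density and weight of 0.5, int8_mask, float16 dtype); the exact file shipped with the repository may differ:

```python
# Reconstructed mergekit configuration based on the merge details above.
# The actual config used by the author may differ; this only reflects
# the parameters stated in this card.
import yaml

merge_config = {
    "merge_method": "ties",
    "base_model": "mistralai/Mistral-Small-24B-Instruct-2501",
    "models": [
        {"model": name, "parameters": {"density": 0.5, "weight": 0.5}}
        for name in [
            "huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated",
            "cognitivecomputations/Dolphin3.0-Mistral-24B",
            "TheDrummer/Cydonia-24B-v2",
        ]
    ],
    "parameters": {"int8_mask": True},
    "dtype": "float16",
}

with open("merge_config.yml", "w") as f:
    yaml.safe_dump(merge_config, f, sort_keys=False)

# The merge itself would then be run with mergekit's CLI:
#   mergekit-yaml merge_config.yml ./BuddyGlassIsBonziBuddyUncensored
```

In TIES merging, density controls what fraction of each model's task vector is retained after trimming, while weight controls how strongly each contributor is mixed in; the equal 0.5/0.5 settings here give the three models symmetric influence over the merged weights.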