darkc0de/BuddyGlassUncensored2025.3

Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Context Length: 32k · Published: Mar 2, 2025 · Architecture: Transformer

darkc0de/BuddyGlassUncensored2025.3 is a 24 billion parameter language model created by darkc0de, merged using the TIES method with mistralai/Mistral-Small-24B-Instruct-2501 as its base. This model integrates capabilities from several specialized Mistral-based models, including Dolphin3.0-Mistral-24B and Cydonia-24B-v2. It is designed to combine diverse strengths from its constituent models, offering a versatile foundation for various instruction-following tasks.


Model Overview

darkc0de/BuddyGlassUncensored2025.3 is a 24 billion parameter language model developed by darkc0de. It was created using the TIES merge method from mergekit, building upon mistralai/Mistral-Small-24B-Instruct-2501 as its base model.

Key Capabilities

This model is a composite of several specialized Mistral-based models, aiming to combine their individual strengths. The merge includes:

  • huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
  • cognitivecomputations/Dolphin3.0-Mistral-24B
  • TheDrummer/Cydonia-24B-v2
  • huihui-ai/Arcee-Blitz-abliterated

By merging these models, BuddyGlassUncensored2025.3 is intended to offer a broad range of instruction-following and generative capabilities, drawing on the fine-tuning each component contributes.
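A TIES merge of this kind is typically described to mergekit in a YAML configuration. The sketch below is illustrative only: the model list and base model come from this card, but the `density`, `weight`, and other parameter values are assumptions, not the author's actual recipe.

```yaml
# Hypothetical mergekit TIES configuration for a merge like this one.
# density/weight values are placeholders, not the published settings.
merge_method: ties
base_model: mistralai/Mistral-Small-24B-Instruct-2501
models:
  - model: huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
    parameters:
      density: 0.5
      weight: 0.25
  - model: cognitivecomputations/Dolphin3.0-Mistral-24B
    parameters:
      density: 0.5
      weight: 0.25
  - model: TheDrummer/Cydonia-24B-v2
    parameters:
      density: 0.5
      weight: 0.25
  - model: huihui-ai/Arcee-Blitz-abliterated
    parameters:
      density: 0.5
      weight: 0.25
parameters:
  normalize: true
dtype: bfloat16
```

In a TIES merge, `density` controls how much of each model's task-vector is kept after trimming low-magnitude deltas, and `weight` scales its contribution before sign-consensus resolution against the base model.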

Good For

  • General-purpose instruction following: Benefits from the instruction-tuned nature of its base and merged models.
  • Exploratory use cases: Suitable for developers looking for a model that combines different behavioral characteristics from its varied merge components.
  • Research into model merging: Provides an example of a TIES merge configuration with multiple specialized models.
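For instruction-following use, the model is typically served behind an OpenAI-compatible chat-completions endpoint. The sketch below only builds the request payload; the `API_URL` is a placeholder, and the sampling parameters are assumptions rather than recommended settings for this model.

```python
import json

# Hypothetical endpoint -- substitute whatever your hosting
# provider actually exposes for this model.
API_URL = "https://example-host/v1/chat/completions"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion payload for this model."""
    return {
        "model": "darkc0de/BuddyGlassUncensored2025.3",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,  # illustrative value, not a tuned default
    }

payload = build_request("Explain the TIES merge method in two sentences.")
print(json.dumps(payload, indent=2))
```

Sending this payload (for example with `requests.post(API_URL, json=payload)`) returns a standard chat-completion response from any OpenAI-compatible server hosting the model.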