darkc0de/BuddyGlassNeverSleeps-methheadmethod-v0.2

Text Generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Sep 18, 2024 · Architecture: Transformer

darkc0de/BuddyGlassNeverSleeps-methheadmethod-v0.2 is an 8-billion-parameter language model produced with the Della merge method on a Llama-3.1-8B base. It combines Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2, mlabonne/NeuralDaredevil-8B-abliterated, and mlabonne/Hermes-3-Llama-3.1-8B-lorablated, and is intended for general-purpose text generation, drawing on the combined characteristics of its constituent models.


Model Overview

darkc0de/BuddyGlassNeverSleeps-methheadmethod-v0.2 was created by merging several pre-trained models with the Della merge method via mergekit, combining the strengths of its constituent models into a single 8-billion-parameter checkpoint.

Merge Details

The merge uses Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2 as its base model and integrates contributions from two additional models:

  • mlabonne/NeuralDaredevil-8B-abliterated
  • mlabonne/Hermes-3-Llama-3.1-8B-lorablated

The merge configuration set normalize: true, int8_mask: true, density: 0.7, lambda: 1.1, and epsilon: 0.2, with a float16 dtype. These settings aim to blend the characteristics of the source models into a single cohesive model.
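
For reference, this recipe maps onto a mergekit YAML configuration roughly like the sketch below. Only the method, model names, and listed parameter values come from this card; the per-model weights and the choice to set density globally are assumptions, since the original config file is not reproduced here.

```yaml
# Sketch of a mergekit config consistent with the parameters above.
# Assumptions: density is applied globally and each donor model uses
# an (unstated) weight of 1.0.
merge_method: della
base_model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
models:
  - model: mlabonne/NeuralDaredevil-8B-abliterated
    parameters:
      weight: 1.0   # assumed; not stated in the card
  - model: mlabonne/Hermes-3-Llama-3.1-8B-lorablated
    parameters:
      weight: 1.0   # assumed; not stated in the card
parameters:
  normalize: true
  int8_mask: true
  density: 0.7
  lambda: 1.1
  epsilon: 0.2
dtype: float16
```

A file like this would be passed to mergekit's standard mergekit-yaml entry point to produce the merged weights.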

Potential Use Cases

Given its Llama-3.1-8B lineage, the model should suit a range of general-purpose text generation tasks (a brief usage sketch follows the list), including:

  • Creative writing and content generation
  • Conversational AI and chatbots
  • Text summarization and expansion
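
As a Llama-3.1-8B derivative, the model should load with the standard Hugging Face transformers API. The minimal sketch below assumes the repository ships a tokenizer with a Llama-3.1-style chat template (typical for these derivatives); adjust dtype and device placement to your hardware.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "darkc0de/BuddyGlassNeverSleeps-methheadmethod-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the checkpoint's stored dtype (float16 per the merge config)
    device_map="auto",   # spread layers across available GPU(s) / CPU
)

# Assumes the tokenizer defines a chat template, as Llama-3.1 derivatives typically do.
messages = [
    {"role": "user", "content": "Write a two-sentence opening for a noir short story."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=200, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```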