darkc0de/BuddyGlass_v0.2_Xortron7MethedUpSwitchedUp
darkc0de/BuddyGlass_v0.2_Xortron7MethedUpSwitchedUp is an 8-billion-parameter language model created by darkc0de, merged using the TIES method with Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2 as its base. It integrates capabilities from mlabonne/NeuralDaredevil-8B-abliterated and mlabonne/Hermes-3-Llama-3.1-8B-lorablated, aiming to combine the strengths of its constituent models into a versatile foundation for generative AI tasks.
Model Overview
BuddyGlass_v0.2_Xortron7MethedUpSwitchedUp is built on the Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2 base model and was produced with the TIES merge method via mergekit, which combines several pre-trained Llama-3.1 models into a single set of weights.
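The TIES procedure works in three steps: trim each fine-tuned model's task vector (its delta from the base) to the largest-magnitude fraction given by `density`, elect a per-parameter sign by weighted majority, then average only the values that agree with the elected sign. The sketch below illustrates this on toy NumPy arrays; it is a simplified illustration of the algorithm, not mergekit's actual implementation:

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5, weights=None):
    """Toy TIES merge: trim task vectors, elect signs, average survivors."""
    deltas = [ft - base for ft in finetuned]  # task vectors
    if weights is None:
        weights = [1.0] * len(deltas)
    trimmed = []
    for d in deltas:
        # Keep only the top-`density` fraction of entries by magnitude.
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d))[::-1][k - 1]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    # Elect one sign per parameter from the weighted sum of trimmed deltas.
    elected = np.sign(sum(w * t for w, t in zip(weights, trimmed)))
    merged = np.zeros_like(base)
    counts = np.zeros_like(base)
    for w, t in zip(weights, trimmed):
        agree = np.sign(t) == elected  # drop sign-conflicting values
        merged += np.where(agree, w * t, 0.0)
        counts += np.where(agree, w, 0.0)
    merged = np.where(counts > 0, merged / np.maximum(counts, 1e-12), 0.0)
    return base + merged
```

Note how a parameter where the two donors disagree in sign (one pushes positive, the other negative) is zeroed out rather than averaged, which is the key difference between TIES and a plain weighted average.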
Key Merge Details
This model is a strategic merge of the following components:
- Base Model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
- Merged Models:
  - mlabonne/NeuralDaredevil-8B-abliterated
  - mlabonne/Hermes-3-Llama-3.1-8B-lorablated
Each merged model contributed with a density of 0.5 and a weight of 0.5, giving the two donors equal influence in the final merge. The TIES method was configured with normalize: false and int8_mask: true, and the model's dtype is float16.
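Under these settings, the mergekit configuration would look roughly like the following. This is a hypothetical reconstruction from the parameters stated above, not the published config file:

```yaml
# Reconstructed mergekit TIES config (illustrative)
models:
  - model: mlabonne/NeuralDaredevil-8B-abliterated
    parameters:
      density: 0.5
      weight: 0.5
  - model: mlabonne/Hermes-3-Llama-3.1-8B-lorablated
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```

With normalize: false, the weighted task vectors are summed as-is rather than rescaled by the total weight, so the 0.5 weights directly set each donor's contribution.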
Potential Use Cases
Given its merged architecture, BuddyGlass_v0.2_Xortron7MethedUpSwitchedUp is suited to applications that benefit from a blend of its constituents' capabilities: a Llama-3.1-based foundation combined with a refusal-ablated ("abliterated") model and a LoRA-blended ("lorablated") model. Developers can use it wherever these combined strengths are beneficial.
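Since the merge is Llama-3.1-based, it presumably inherits the Llama 3.1 chat template (in practice, `tokenizer.apply_chat_template` in transformers handles this automatically after loading the model by its repo id). A minimal helper showing the single-turn prompt layout that template produces might look like this; the function name is illustrative:

```python
def build_llama3_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in the Llama 3.1 chat layout,
    which this Llama-3.1-based merge is assumed to inherit."""
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```

The prompt ends with an open assistant header, so generation continues as the assistant's reply until the model emits `<|eot_id|>`.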