FiditeNemini/Unhinged-Author-70B

Warm · Public · 70B · FP8 · 32768 · 4 · Jan 28, 2025 · Hugging Face

Model Overview

FiditeNemini/Unhinged-Author-70B is a 70-billion-parameter language model developed by FiditeNemini, with a 32768-token context length. The model was created with MergeKit using the TIES merge method.

Merge Details

The merge uses Steelskull/L3.3-MS-Nevoria-70b as the base model, with huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated merged on top of it. The TIES method, described in the paper "TIES-Merging: Resolving Interference When Merging Models" (arXiv:2306.01708), was used to combine these models, aiming to leverage their respective strengths.

Configuration

The merge configuration assigns huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated a weight of 1 and a density of 1, with normalization and int8 masking enabled. The model's dtype is bfloat16.
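The parameters above can be sketched as a MergeKit YAML config. This is a hypothetical reconstruction from the details stated here, using MergeKit's standard field names; the actual config file is not reproduced in this card.

```yaml
# Hypothetical reconstruction of the merge config described above.
models:
  - model: huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: Steelskull/L3.3-MS-Nevoria-70b
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```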

Potential Use Cases

Given its architecture and the models it integrates, Unhinged-Author-70B is suitable for a broad range of natural language processing tasks, including:

  • Text generation
  • Question answering
  • Summarization
  • General conversational AI
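For text generation, the model can be loaded with the Hugging Face transformers library. The sketch below is a minimal, hypothetical usage example; the dtype and `device_map="auto"` choices are assumptions (a 70B model requires substantial GPU memory and the accelerate package for automatic sharding), not settings from the model card.

```python
# Hypothetical usage sketch; bfloat16 and device_map="auto" are
# assumptions, not documented settings for this model.
MODEL_ID = "FiditeNemini/Unhinged-Author-70B"

def generate(prompt: str, max_new_tokens: int = 200) -> str:
    """Load the model lazily and generate a continuation of `prompt`."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # A 70B model needs substantial GPU memory; device_map="auto"
    # shards it across available devices (requires accelerate).
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write the opening paragraph of a thriller novel."))
```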