FiditeNemini/Unhinged-Author-70B
Text generation · Model size: 70B · Quant: FP8 · Context length: 32K · Architecture: Transformer · Concurrency cost: 4 · Published: Jan 28, 2025

FiditeNemini/Unhinged-Author-70B is a 70-billion-parameter language model created by FiditeNemini, merged with the TIES method using Steelskull/L3.3-MS-Nevoria-70b as the base. It integrates huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated and supports a 32,768-token context length. The model is intended for general language tasks, drawing on the combined strengths of its constituent models.


Model Overview

The model was created with MergeKit, using the TIES merge method, and retains the 32,768-token context length of its constituent Llama-3.3-based models.

Merge Details

The model's foundation is Steelskull/L3.3-MS-Nevoria-70b, which served as the base model for the merge. The primary model integrated into this merge is huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated. The TIES method, described in "TIES-Merging: Resolving Interference When Merging Models" (arXiv:2306.01708), was used to combine these models, aiming to leverage their respective strengths.
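To make the merge method concrete, here is a minimal NumPy sketch of the three TIES steps (trim, elect sign, disjoint merge) applied to "task vectors" (fine-tuned weights minus base weights). This is an illustration of the algorithm from the paper, not MergeKit's actual implementation:

```python
import numpy as np

def ties_merge(task_vectors, density=0.2):
    """Merge a list of 1-D task vectors with TIES:
    1. Trim: keep only the top-`density` fraction of entries by magnitude.
    2. Elect sign: per parameter, pick the sign with the larger summed mass.
    3. Disjoint merge: average only entries that agree with the elected sign.
    """
    trimmed = []
    for tv in task_vectors:
        k = max(1, int(density * tv.size))
        thresh = np.sort(np.abs(tv))[-k]  # magnitude of the k-th largest entry
        trimmed.append(np.where(np.abs(tv) >= thresh, tv, 0.0))
    trimmed = np.stack(trimmed)

    # Elect the dominant sign per parameter by summed value; break ties toward +.
    sign = np.sign(trimmed.sum(axis=0))
    sign[sign == 0] = 1.0

    # Average only the surviving entries whose sign matches the elected one.
    agree = np.sign(trimmed) == sign
    counts = np.maximum(agree.sum(axis=0), 1)
    return np.where(agree, trimmed, 0.0).sum(axis=0) / counts
```

Note that with `density=1`, as used for this merge, the trim step is a no-op and only sign election and disjoint averaging take effect.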

Configuration

The merge configuration assigns the huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated model a weight of 1 and a density of 1. The TIES method was applied with normalization enabled and int8 masking, and the merged model's dtype is bfloat16.
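Based on the parameters stated above, the MergeKit YAML for this merge likely resembled the following; the actual config file is not shown on this card, so treat this as a reconstruction:

```yaml
models:
  - model: huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: Steelskull/L3.3-MS-Nevoria-70b
parameters:
  density: 1
  normalize: true
  int8_mask: true
dtype: bfloat16
```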

Potential Use Cases

Given its architecture and the models it integrates, Unhinged-Author-70B is suitable for a broad range of natural language processing tasks, including:

  • Text generation
  • Question answering
  • Summarization
  • General conversational AI
Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model cover the following sampler settings:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
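For illustration, these parameters map directly onto an OpenAI-compatible completion request. The values below are placeholders, not the actual user statistics (which are not shown here):

```json
{
  "model": "FiditeNemini/Unhinged-Author-70B",
  "prompt": "Write the opening paragraph of a gothic short story.",
  "temperature": 1.0,
  "top_p": 0.95,
  "top_k": 40,
  "frequency_penalty": 0.0,
  "presence_penalty": 0.0,
  "repetition_penalty": 1.05,
  "min_p": 0.05
}
```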