CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES

Text Generation · Model Size: 14.8B · Quant: FP8 · Context Length: 32k · Published: Dec 7, 2024 · License: apache-2.0 · Architecture: Transformer

CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES is a 14.8B-parameter language model based on the Qwen2.5 architecture, created by CombinHorizon through a TIES merge. It combines the Josiefied-Qwen2.5-14B-Instruct-abliterated-v4 model with the Qwen/Qwen2.5-14B base, aiming to pair the base model's general-purpose capability with the instruct variant's instruction-following behavior.


Model Overview

CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES was created with the TIES merge method from mergekit, combining the instruction-tuned Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4 with Qwen/Qwen2.5-14B as the merge base. Because both components share the Qwen2.5-14B architecture, the merge can operate tensor-by-tensor over identically shaped weights.
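Merges like this are typically declared in a mergekit YAML config and run with the `mergekit-yaml` CLI. The model card does not publish the actual config, so the `density`, `weight`, and `dtype` values below are illustrative assumptions, not the settings used for this model:

```yaml
# Hypothetical mergekit config sketch for a TIES merge of this kind.
# density/weight/dtype are assumed values, not those used for this model.
models:
  - model: Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4
    parameters:
      density: 0.5   # fraction of task-vector entries kept after trimming
      weight: 1.0
merge_method: ties
base_model: Qwen/Qwen2.5-14B
parameters:
  normalize: true
dtype: bfloat16
```

A config like this would be executed with `mergekit-yaml config.yaml ./merged-model`, producing a standard Hugging Face checkpoint directory.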

Key Characteristics

  • Architecture: Based on the Qwen2.5 family, known for strong general-purpose language understanding and generation.
  • Merge Method: Uses TIES (TrIm, Elect Sign & Merge), which reduces interference between fine-tunes by trimming low-magnitude parameter deltas, electing a dominant sign per parameter, and averaging only the values that agree with that sign.
  • Base Model: Leverages Qwen/Qwen2.5-14B, providing a solid foundation for diverse language tasks.
  • Instruction Tuning: Incorporates Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4, an instruction-tuned variant whose "abliterated" naming indicates refusal behavior has been ablated, with an emphasis on instruction-following and conversational ability.
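
The TIES procedure described above can be sketched in a few lines. This is a stdlib-only toy illustration of the three steps (trim, elect sign, disjoint merge) on flat lists of weights; the `density` and `lam` values are illustrative, and the real merge is performed per tensor by mergekit, not by code like this:

```python
def ties_merge(base, finetuned_models, density=0.5, lam=1.0):
    """Toy TIES merge over flat lists of floats (one 'parameter vector' each)."""
    n = len(base)
    # 1) Task vectors: difference between each fine-tune and the base.
    task_vectors = [[ft[i] - base[i] for i in range(n)] for ft in finetuned_models]
    # 2) Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for tv in task_vectors:
        k = max(1, int(round(density * n)))
        threshold = sorted((abs(v) for v in tv), reverse=True)[k - 1]
        trimmed.append([v if abs(v) >= threshold else 0.0 for v in tv])
    # 3) Elect sign: per coordinate, the sign with the larger total magnitude wins
    #    (the sign of the plain sum, since sum = pos_magnitude - neg_magnitude).
    merged_tv = []
    for i in range(n):
        vals = [tv[i] for tv in trimmed]
        sign = 1.0 if sum(vals) >= 0 else -1.0
        # 4) Disjoint mean: average only the values agreeing with the elected sign.
        agree = [v for v in vals if v != 0.0 and (v > 0) == (sign > 0)]
        merged_tv.append(sum(agree) / len(agree) if agree else 0.0)
    # 5) Rescale the merged task vector and add it back onto the base weights.
    return [base[i] + lam * merged_tv[i] for i in range(n)]
```

Note how a coordinate where the two fine-tunes disagree in sign keeps only the values matching the elected sign, rather than averaging them toward zero; this is the interference resolution TIES is designed for.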

Potential Use Cases

  • Instruction-Following: Well-suited for tasks requiring precise adherence to given instructions.
  • General Text Generation: Capable of generating coherent and contextually relevant text across various domains.
  • Research and Experimentation: Provides a merged model for exploring the effectiveness of the TIES method on Qwen2.5 variants.
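
For the use cases above, the model can be loaded like any other Qwen2.5-family checkpoint with the `transformers` library. This is a generic usage sketch, not code from the model card; the system prompt and generation settings are illustrative, and `transformers` is imported lazily so the message-building helper works without the dependency installed:

```python
MODEL_ID = "CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES"

def build_chat(user_prompt, system_prompt="You are a helpful assistant."):
    """Qwen2.5 models use a ChatML-style chat template; messages are
    plain role/content dicts consumed by tokenizer.apply_chat_template()."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

def generate(prompt, max_new_tokens=256):
    # Imported here so this sketch stays importable without the heavy deps.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_chat(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    with torch.no_grad():
        out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)
```

At FP8 quantization (as listed in the metadata), a 14.8B model needs roughly 15 GB of weights in memory, so a single 24 GB GPU is a reasonable target for local inference.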