Kukedlc/MyModelsMerge-7b

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 24, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Kukedlc/MyModelsMerge-7b is a 7-billion-parameter language model created by Kukedlc by merging eight models with the LazyMergekit tool using the DARE TIES merge method. The merge incorporates liminerity/M7-7b, Kukedlc/Neural4gsm8k, and six further Kukedlc-developed models, each assigned its own weight and density, with the aim of combining their strengths. The model is intended for general text-generation tasks.


Overview

Kukedlc/MyModelsMerge-7b is a 7-billion-parameter language model developed by Kukedlc. It is a composite model, created by merging eight constituent models with the LazyMergekit tool using the DARE TIES merge method.
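For intuition: DARE randomly drops a fraction of each model's delta weights (its difference from the base model) and rescales the survivors, while TIES resolves sign conflicts between models by per-parameter majority vote before summing. The toy sketch below illustrates the idea on flat tensors; it is a simplified illustration, not mergekit's actual implementation, and the function names are our own.

```python
import torch

def dare(delta: torch.Tensor, density: float) -> torch.Tensor:
    """DARE: randomly Drop delta weights And REscale survivors by 1/density."""
    mask = (torch.rand_like(delta) < density).float()
    return delta * mask / density

def dare_ties_merge(base, deltas, weights, density=0.5):
    # Sparsify each model's task vector (model minus base) with DARE, then weight it.
    sparse = torch.stack([w * dare(d, density) for d, w in zip(deltas, weights)])
    # TIES sign election: take the majority sign across models, per parameter.
    elected = torch.sign(sparse.sum(dim=0))
    # Keep only contributions that agree with the elected sign, then sum.
    agree = (torch.sign(sparse) == elected).float()
    return base + (sparse * agree).sum(dim=0)

# Toy usage: merge three random "task vectors" onto a zero base.
base = torch.zeros(5)
deltas = [torch.randn(5) for _ in range(3)]
merged = dare_ties_merge(base, deltas, weights=[0.4, 0.3, 0.3])
print(merged)
```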

Key Components and Configuration

This model integrates contributions from:

  • liminerity/M7-7b
  • Kukedlc/Neural4gsm8k
  • Kukedlc/Jupiter-k-7B-slerp
  • Kukedlc/NeuralMaxime-7B-slerp
  • Kukedlc/NeuralFusion-7b-Dare-Ties
  • Kukedlc/Neural-Krishna-Multiverse-7b-v3
  • Kukedlc/NeuTrixOmniBe-DPO
  • Kukedlc/NeuralSirKrishna-7b (also serving as the base model)

Each merged component is assigned its own weight and density parameters, which determine its contribution to the final model's characteristics. The configuration also sets int8_mask: true, normalize: true, and dtype: bfloat16 to keep the merge memory-efficient (sketched below).
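A mergekit DARE TIES configuration for a merge like this typically takes the shape below. This is a reconstruction for illustration: the structure follows mergekit's documented YAML schema, but the weight and density values are placeholders, since the card above does not list the actual numbers.

```python
# Illustrative reconstruction of a LazyMergekit/mergekit DARE TIES config.
# The weight/density values below are placeholders, not the model's real settings.
import yaml

components = [
    "liminerity/M7-7b",
    "Kukedlc/Neural4gsm8k",
    "Kukedlc/Jupiter-k-7B-slerp",
    "Kukedlc/NeuralMaxime-7B-slerp",
    "Kukedlc/NeuralFusion-7b-Dare-Ties",
    "Kukedlc/Neural-Krishna-Multiverse-7b-v3",
    "Kukedlc/NeuTrixOmniBe-DPO",
    "Kukedlc/NeuralSirKrishna-7b",
]

merge_config = {
    "models": [
        {"model": m, "parameters": {"weight": 0.125, "density": 0.5}}  # placeholders
        for m in components
    ],
    "merge_method": "dare_ties",
    "base_model": "Kukedlc/NeuralSirKrishna-7b",  # also one of the merged components
    "parameters": {"int8_mask": True, "normalize": True},
    "dtype": "bfloat16",
}

# Write the YAML file that mergekit consumes to perform the merge.
with open("merge_config.yaml", "w") as f:
    yaml.safe_dump(merge_config, f, sort_keys=False)
```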

Usage

The model can be used from Python via the transformers library for text-generation tasks, as in the sketch below.
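A minimal sketch of that usage, assuming the standard transformers pipeline API and a chat-style prompt; the prompt text and generation settings are illustrative.

```python
# Minimal text-generation example with the transformers pipeline.
import torch
import transformers
from transformers import AutoTokenizer

model_id = "Kukedlc/MyModelsMerge-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Format a chat-style prompt with the model's chat template.
messages = [{"role": "user", "content": "What is a merge of language models?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True,
                   temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```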