mahiatlinux/ShadowDolph-7B-v1

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

ShadowDolph-7B-v1 by mahiatlinux is a 7 billion parameter language model with a 4096-token context length, created by merging 'merged1and2-and-dolphin' and 'YamShadow-7B' using a slerp merge method. This model is designed to combine the strengths of its constituent models, offering a versatile base for various natural language processing tasks. Its merged architecture aims to provide enhanced performance across general-purpose applications.


ShadowDolph-7B-v1 Overview

ShadowDolph-7B-v1 is a 7 billion parameter language model developed by mahiatlinux, built upon a unique merging strategy. This model is a composite of two distinct base models: mahiatlinux/merged1and2-and-dolphin and automerger/YamShadow-7B. The integration was performed using a slerp (spherical linear interpolation) merge method via LazyMergekit, aiming to combine and balance the capabilities of its source models.
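Slerp merges of this kind are typically driven by a mergekit/LazyMergekit YAML configuration. The following is a hypothetical sketch for this model pairing; the layer ranges, `t` schedules, and dtype are illustrative assumptions, not the model's published configuration:

```yaml
slices:
  - sources:
      - model: mahiatlinux/merged1and2-and-dolphin
        layer_range: [0, 32]
      - model: automerger/YamShadow-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: mahiatlinux/merged1and2-and-dolphin
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]   # illustrative per-layer schedule for attention weights
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]   # illustrative schedule for MLP weights
    - value: 0.5                     # default t for all remaining tensors
dtype: bfloat16
```

The `filter` entries are what allow the `t` interpolation factor to differ between self-attention and MLP layers, as noted in the characteristics below.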

Key Characteristics

  • Merged Architecture: Combines merged1and2-and-dolphin and YamShadow-7B to leverage their respective strengths.
  • Parameter Count: Features 7 billion parameters, suitable for a range of generative and analytical tasks.
  • Context Length: Supports a context window of 4096 tokens, allowing for processing moderately long inputs.
  • Merge Configuration: Utilizes a specific slerp configuration with varying t values for self-attention and MLP layers, indicating a fine-tuned approach to layer-wise merging.
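To make the merge method concrete: slerp interpolates between two weight tensors along the arc of a hypersphere rather than along a straight line, which preserves vector magnitude better than plain averaging. A minimal, self-contained sketch (not mergekit's actual implementation) of slerp on flat vectors:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1 at fraction t."""
    # Angle between the two vectors, via the normalized dot product.
    d0 = math.sqrt(sum(x * x for x in v0))
    d1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (d0 * d1)
    dot = max(-1.0, min(1.0, dot))  # guard against floating-point drift
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly (anti)parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

At `t = 0` this returns the first model's weights, at `t = 1` the second's; the varying per-layer `t` values mentioned above shift each layer's blend toward one parent or the other.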

Usage and Application

This model is designed for general text generation and understanding tasks. Developers can integrate it into their projects using the Hugging Face transformers library. It supports standard text-generation pipelines with configurable sampling parameters such as max_new_tokens, temperature, top_k, and top_p for controlling output creativity and diversity.