automerger/Experiment26Yamshadow-7B

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 15, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Experiment26Yamshadow-7B is a 7 billion parameter language model created by automerger through an automated merge process orchestrated by Maxime Labonne. It combines rwitz/experiment26-truthy-iter-0 with automerger/YamShadow-7B using the DARE TIES merge method, configured for bfloat16 precision and a 4096-token context length. The merge aims to capture the strengths of both constituent models in a single 7B checkpoint.


Overview

Experiment26Yamshadow-7B was produced by automerger's automated merging pipeline, with the merge configuration designed by Maxime Labonne. The merge combines two distinct models, rwitz/experiment26-truthy-iter-0 and automerger/YamShadow-7B, using the DARE TIES method, which sparsifies each model's fine-tuning deltas before resolving sign conflicts and summing them.
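To make the DARE step concrete, here is a toy NumPy sketch of its core idea: each fine-tuned model's delta from the base is randomly dropped at a high rate, and the surviving deltas are rescaled to preserve the expected sum. This is an illustrative simplification, not the actual mergekit implementation; all names and the drop rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dare_sparsify(base, finetuned, drop_rate=0.9, rng=rng):
    """Toy DARE step (illustrative): randomly drop delta weights,
    then rescale the survivors by 1/(1 - drop_rate) so the merged
    parameters keep the same expected value."""
    delta = finetuned - base
    keep = rng.random(delta.shape) >= drop_rate  # Bernoulli keep mask
    return base + delta * keep / (1.0 - drop_rate)

# Example: base weights of 0, fine-tuned weights of 1.
base = np.zeros(1000)
finetuned = np.ones(1000)
merged = dare_sparsify(base, finetuned, drop_rate=0.9)
# Roughly 10% of entries survive, each rescaled to about 10,
# so the mean stays near the original delta of 1.
```

In the full TIES stage, these sparsified deltas from multiple models are sign-aligned and summed; the sketch above shows only the drop-and-rescale half that gives DARE its name.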

Key Characteristics

  • Automated Merge: Created using a specific DARE TIES merge configuration, indicating an experimental approach to model development.
  • Parameter Count: Features 7 billion parameters, offering a balance between performance and computational requirements.
  • Context Length: Supports a context window of 4096 tokens, suitable for various conversational and text generation tasks.
  • Precision: Utilizes bfloat16 data type, which can offer performance benefits on compatible hardware.
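Merges like this are typically defined with a mergekit YAML configuration. The exact configuration for this model is not reproduced here; the fragment below is a sketch of what a DARE TIES config combining these two models could look like, with the `density`, `weight`, and `base_model` values being assumptions for illustration.

```yaml
# Illustrative mergekit config (a sketch, not the exact config used
# for this model; density, weight, and base_model are assumptions).
models:
  - model: rwitz/experiment26-truthy-iter-0
    parameters:
      density: 0.53
      weight: 0.5
  - model: automerger/YamShadow-7B
    parameters:
      density: 0.53
      weight: 0.5
merge_method: dare_ties
base_model: automerger/YamShadow-7B
dtype: bfloat16
```

Here `density` controls how many delta parameters survive the DARE drop step, and `weight` scales each model's contribution to the final sum.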

Usage

This model is designed for text generation and integrates with the Hugging Face transformers library. Developers can use it to generate text from prompts, following the example usage in its repository.
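A minimal loading-and-generation sketch with transformers is shown below. It assumes the transformers and torch packages are installed and that the model id resolves on the Hub; the prompt and generation settings are arbitrary examples, and downloading the 7B weights requires substantial disk and memory.

```python
# Minimal sketch: load the model and generate text with transformers.
# Assumes `pip install transformers torch` and network access to the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "automerger/Experiment26Yamshadow-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # card lists bfloat16 precision
    device_map="auto",
)

prompt = "Explain model merging in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The 4096-token context limit applies to the prompt and generated tokens combined, so long prompts leave less room for `max_new_tokens`.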