leveldevai/MarcDareBeagle-7B

Text Generation · Open Weights

  • Model Size: 7B
  • Quantization: FP8
  • Context Length: 4K
  • Concurrency Cost: 1
  • Published: Jan 19, 2024
  • License: apache-2.0
  • Architecture: Transformer

MarcDareBeagle-7B is a 7 billion parameter language model created by leveldevai, resulting from a slerp merge of flemmingmiguel/MarcMistral-7B and leveldevai/TurdusDareBeagle-7B. The merge combines the strengths of its constituent models, offering balanced performance on general-purpose language generation tasks. It is designed for applications that need a compact yet capable model and for deployments where resource efficiency matters.


Overview

MarcDareBeagle-7B was produced with LazyMergekit, which performed a slerp merge of two models: flemmingmiguel/MarcMistral-7B and leveldevai/TurdusDareBeagle-7B. Merging in this way integrates features and capabilities from both base models, with the aim of producing a more robust and versatile model than either parent alone.

Key Characteristics

  • Architecture: A merge of two 7B parameter, Mistral-based models, inheriting characteristics from both MarcMistral-7B and TurdusDareBeagle-7B.
  • Merge Method: Uses slerp (spherical linear interpolation) to blend the two parents' weights, with specific parameter weighting applied to self-attention and MLP layers (see the sketch after this list).
  • Parameter Configuration: The merge configuration assigns varying interpolation values to the self_attn and mlp filters, with a fallback value for all other tensors, reflecting a deliberately tuned blend rather than a uniform average.
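
To make the merge method concrete, here is a minimal sketch of spherical linear interpolation between two weight tensors. It is illustrative only: the function name, the interpolation factors in T_BY_FILTER, and the per-layer-type split are hypothetical stand-ins for the card's description; the actual merge was performed by LazyMergekit, not this code.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0 (e.g. a MarcMistral-7B tensor); t=1 returns v1
    (e.g. the matching TurdusDareBeagle-7B tensor). Intermediate t
    values move along the arc connecting the two flattened tensors.
    """
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Normalize copies to measure the angle between the two directions.
    v0_n = v0_flat / (v0_flat.norm() + eps)
    v1_n = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_n, v1_n), -1.0, 1.0)
    omega = torch.arccos(dot)  # angle between the tensors
    if omega.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return ((1.0 - t) * v0_flat + t * v1_flat).reshape(v0.shape)
    sin_omega = torch.sin(omega)
    s0 = torch.sin((1.0 - t) * omega) / sin_omega
    s1 = torch.sin(t * omega) / sin_omega
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)

# Hypothetical per-layer-type factors mirroring the card's description:
# separate interpolation values for self-attention and MLP weights,
# plus a fallback for all other tensors.
T_BY_FILTER = {"self_attn": 0.6, "mlp": 0.4, "default": 0.5}
```

Compared with plain linear averaging, slerp follows the arc between the two weight directions rather than cutting across it, which tends to preserve the geometry of each parent's weights in the merged model.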

Usage

This model is suited to general text generation tasks. It can be loaded in Python with the transformers library and driven through a standard text-generation pipeline, with parameters such as max_new_tokens, temperature, top_k, and top_p controlling the creativity and coherence of the output.
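
A minimal usage sketch follows. It assumes a GPU-backed environment with torch, transformers, and accelerate installed; the prompt and sampling values are illustrative, not recommended settings.

```python
import torch
from transformers import pipeline

# Load the model through a standard text-generation pipeline.
# device_map="auto" requires the accelerate package.
generator = pipeline(
    "text-generation",
    model="leveldevai/MarcDareBeagle-7B",
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = generator(
    "Explain what a model merge is in one paragraph.",
    max_new_tokens=256,   # cap on the number of generated tokens
    do_sample=True,       # enable sampling so temperature/top_k/top_p apply
    temperature=0.7,      # lower values give more deterministic output
    top_k=50,             # sample only from the 50 most likely tokens
    top_p=0.95,           # nucleus sampling cutoff
)
print(outputs[0]["generated_text"])
```

Lowering temperature or top_p tightens the output toward high-probability continuations; raising them increases diversity at some cost to coherence.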