leveldevai/MarcBeagle-7B

Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

MarcBeagle-7B is a 7 billion parameter language model developed by leveldevai, created through a slerp merge of flemmingmiguel/MarcMistral-7B and leveldevai/TurdusBeagle-7B. This model leverages the strengths of its constituent models, offering a 4096-token context length. It is designed for general language generation tasks, benefiting from the combined architectures of its merged predecessors.


MarcBeagle-7B: A Merged Language Model

MarcBeagle-7B is a 7 billion parameter language model developed by leveldevai. It is constructed using a slerp merge method, combining two distinct models: flemmingmiguel/MarcMistral-7B and leveldevai/TurdusBeagle-7B. This merging technique aims to synthesize the capabilities of its base models, providing a versatile foundation for various natural language processing tasks.

Key Characteristics

  • Architecture: A Mistral-style transformer. A slerp merge requires its parent models to share the same architecture, so MarcBeagle-7B inherits the common architecture of MarcMistral-7B and TurdusBeagle-7B.
  • Parameter Count: 7 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports a context window of 4096 tokens, suitable for processing moderately long inputs.
  • Merging Method: Utilizes the slerp (spherical linear interpolation) merge method, specifically configured with varying interpolation values across different tensor filters (e.g., self_attn, mlp) to optimize the combined model's performance.
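To make the merging method concrete, here is a minimal sketch of spherical linear interpolation (slerp) applied to two toy weight tensors. This is an illustration of the general technique, not the actual merge tooling or configuration used to build MarcBeagle-7B; the interpolation factor `t` stands in for the per-tensor values the merge config would assign to filters like `self_attn` and `mlp`.

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Interpolates along the great-circle arc between a and b (treated as
    flattened vectors), so the result preserves angular structure better
    than a plain weighted average.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    # Cosine of the angle between the two flattened tensors.
    cos_theta = np.dot(a_flat, b_flat) / (
        np.linalg.norm(a_flat) * np.linalg.norm(b_flat) + eps
    )
    cos_theta = np.clip(cos_theta, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1 - t) * a + t * b
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * a + (
        np.sin(t * theta) / sin_theta
    ) * b

# Halfway between two orthogonal toy "weights": stays on the unit arc.
w1 = np.array([1.0, 0.0])
w2 = np.array([0.0, 1.0])
merged = slerp(0.5, w1, w2)  # → [0.7071, 0.7071]
```

In a real merge, a slerp like this is applied tensor-by-tensor across the two checkpoints, with `t` varied per tensor group as described above.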

Usage and Application

MarcBeagle-7B is designed for general text generation and understanding. Its merged nature suggests a broad applicability, potentially excelling in areas where its constituent models showed strength. Developers can easily integrate it into their projects using the Hugging Face transformers library, leveraging its capabilities for tasks such as content creation, summarization, and conversational AI.
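A minimal loading sketch with the Hugging Face `transformers` library might look like the following. The generation parameters (`max_new_tokens`, `temperature`) and the example prompt are illustrative choices, not values recommended by the model authors.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "leveldevai/MarcBeagle-7B"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion for `prompt` with MarcBeagle-7B."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" places the 7B weights on available GPUs/CPU.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,  # illustrative sampling settings
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize the benefits of merging language models:"))
```

Note that prompts must fit within the model's 4096-token context window, including the tokens reserved for the generated continuation.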