Kukedlc/NeuralKukedlc-7B-Labonned

TEXT GENERATION

  • Concurrency Cost: 1
  • Model Size: 7B
  • Quant: FP8
  • Ctx Length: 4k
  • Published: Feb 10, 2024
  • License: apache-2.0
  • Architecture: Transformer

NeuralKukedlc-7B-Labonned is a 7 billion parameter language model created by Kukedlc, resulting from a merge of mlabonne/NeuralBeagle14-7B and mlabonne/NeuralHermes-2.5-Mistral-7B. This model leverages a slerp merge method to combine the strengths of its base models, offering a balanced performance across general language understanding and generation tasks. With a 4096-token context length, it is suitable for a variety of conversational and text-based applications.


NeuralKukedlc-7B-Labonned Overview

NeuralKukedlc-7B-Labonned is a 7 billion parameter language model developed by Kukedlc, created through a strategic merge of two prominent models: mlabonne/NeuralBeagle14-7B and mlabonne/NeuralHermes-2.5-Mistral-7B. This model utilizes a sophisticated slerp (spherical linear interpolation) merge method, specifically configured to blend the layers and parameters of its constituent models.

Key Characteristics

  • Architecture: Based on the Mistral-7B family, inheriting its efficient design and performance characteristics.
  • Merge Strategy: Employs a slerp merge, with per-layer interpolation weights applied to the self_attn and mlp blocks, allowing each base model to contribute more strongly where it is stronger.
  • Context Length: Supports a context window of 4096 tokens, enabling it to handle moderately long inputs and generate coherent, extended responses.
  • Precision: Configured to use bfloat16 data type, balancing performance and memory efficiency.
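The slerp interpolation at the heart of the merge can be illustrated with a minimal sketch. This is an illustrative implementation of spherical linear interpolation on flattened weight tensors, not the actual mergekit code; the tensors and the interpolation factor `t` below are toy values:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    arc between the two weight directions rather than a straight line.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)            # angle between the weight directions
    if theta < eps:                   # nearly parallel: fall back to lerp
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Toy "layer weights" standing in for the two base models
w_beagle = np.array([1.0, 0.0])
w_hermes = np.array([0.0, 1.0])
merged = slerp(0.5, w_beagle, w_hermes)  # equal contribution from both
```

Unlike plain averaging, slerp preserves the norm-direction geometry of the blended weights, which is why merge recipes often prefer it when combining fine-tuned checkpoints of the same base architecture.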

Intended Use Cases

This model is designed for general-purpose language tasks, benefiting from the combined capabilities of its merged predecessors. It is particularly well-suited for:

  • Conversational AI: Generating human-like text in dialogue systems.
  • Text Generation: Creating creative content, summaries, or expanding on given prompts.
  • Instruction Following: Responding to user instructions effectively, leveraging the instruction-tuned nature of its base models.

Developers can integrate NeuralKukedlc-7B-Labonned using the Hugging Face transformers library, following the standard setup and inference pattern for Mistral-7B-based models.
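A minimal loading-and-inference sketch with transformers is shown below. The repo id follows the naming used in this card (verify it on the Hub before use), and the prompt is a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id, taken from this card's title
model_id = "Kukedlc/NeuralKukedlc-7B-Labonned"

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # matches the precision stated above
        device_map="auto",
    )

    messages = [{"role": "user", "content": "Explain slerp merging in one sentence."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Downloading the full bfloat16 weights requires roughly 15 GB of disk and a GPU with sufficient memory; `device_map="auto"` lets accelerate place layers across available devices.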