Kukedlc/neuronal-7b-Mlab

Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Kukedlc/neuronal-7b-Mlab is a 7 billion parameter language model created by Kukedlc by merging mlabonne/NeuralDaredevil-7B and mlabonne/NeuralHermes-2.5-Mistral-7B with the slerp method. It offers a 4096-token context length and is designed for general text generation tasks, combining the capabilities of its constituent models.


Model Overview

Kukedlc/neuronal-7b-Mlab is a 7 billion parameter language model developed by Kukedlc. It is a merged model, combining two distinct base models: mlabonne/NeuralDaredevil-7B and mlabonne/NeuralHermes-2.5-Mistral-7B. This merge was performed using the LazyMergekit tool, specifically employing the slerp (spherical linear interpolation) merge method.
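To make the merge method concrete, here is a minimal sketch of spherical linear interpolation (slerp) applied to flattened weight tensors. This is an illustration of the underlying math, not mergekit's actual implementation, which also handles per-layer interpolation factors and tensor-shape bookkeeping.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    arc on the hypersphere rather than the straight chord, which is
    why slerp preserves weight-vector norms better than plain lerp.
    """
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # Nearly parallel vectors: fall back to linear interpolation
    # to avoid division by sin(theta) ~ 0.
    if 1.0 - abs(dot) < eps:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# Toy example: interpolate halfway between two orthogonal unit vectors.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # -> approximately [0.7071, 0.7071]
```

Note that the midpoint still has unit norm; a straight average `(a + b) / 2` would have norm ~0.71, illustrating why slerp is often preferred for merging normalized weight directions.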

Key Characteristics

  • Architecture: Based on the Mistral-7B family, inheriting efficiency features such as grouped-query attention.
  • Parameter Count: 7 billion parameters, offering a balance between performance and computational requirements.
  • Context Length: Supports a context window of 4096 tokens.
  • Merge Method: Utilizes the slerp method with specific layer and parameter weighting, as detailed in the configuration, to blend the capabilities of its source models.
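For reference, a LazyMergekit slerp configuration for this pair of models typically looks like the sketch below. The `layer_range`, interpolation factors `t`, and `dtype` shown here are illustrative placeholders based on common LazyMergekit templates; the model's actual published configuration may use different values.

```yaml
slices:
  - sources:
      - model: mlabonne/NeuralDaredevil-7B
        layer_range: [0, 32]
      - model: mlabonne/NeuralHermes-2.5-Mistral-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/NeuralDaredevil-7B
parameters:
  t:
    # Per-module interpolation schedules (illustrative values):
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    # Default factor for all remaining tensors.
    - value: 0.5
dtype: bfloat16
```

The per-filter `t` schedules let the merge favor one parent model's attention layers and the other's MLP layers at different depths, which is the "layer and parameter weighting" the configuration refers to.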

Intended Use Cases

This model is suitable for a variety of general-purpose text generation tasks, benefiting from the combined strengths of NeuralDaredevil-7B and NeuralHermes-2.5-Mistral-7B. Developers can integrate it into applications requiring conversational AI, content creation, or other language-based functionalities.
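A typical way to try the model is through the Hugging Face `transformers` library. This is a generic loading sketch, assuming the standard `AutoModelForCausalLM` path works for this checkpoint; note that it downloads roughly 14 GB of weights on first run.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Kukedlc/neuronal-7b-Mlab"

# Load tokenizer and model; device_map="auto" places weights on GPU if available.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keep prompts within the 4096-token context window noted above; anything longer will be truncated or rejected depending on your generation settings.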