ChaoticNeutrals/Eris-Lelantacles-7b

Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 15, 2024 · License: other · Architecture: Transformer

ChaoticNeutrals/Eris-Lelantacles-7b is a 7-billion-parameter language model created by ChaoticNeutrals by merging Nitral-AI/Eris-Beach_Day-7b and Nitral-AI/Lelanta-lake-7b with the SLERP method. It supports a 4096-token context window and is intended for general language generation tasks, aiming to combine the strengths of its two constituent models.


Model Overview

ChaoticNeutrals/Eris-Lelantacles-7b is a 7-billion-parameter language model developed by ChaoticNeutrals. It was created with the SLERP merge method, combining two base models: Nitral-AI/Eris-Beach_Day-7b and Nitral-AI/Lelanta-lake-7b. The merge configuration applied different interpolation parameters (`t`) to the self-attention and MLP layers, indicating a nuanced per-module blending strategy rather than a single uniform mix.
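SLERP interpolates between two weight tensors along the arc of a hypersphere rather than along a straight line, which tends to preserve the geometry of the blended weights. A minimal sketch of the interpolation itself (plain Python on small vectors; real merge tooling applies this per weight tensor):

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two vectors.

    t=0 returns v0, t=1 returns v1; intermediate values follow
    the arc of the hypersphere instead of a straight chord.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the two vectors.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))  # clamp for numerical safety
    theta = math.acos(dot)
    if abs(theta) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors stays on the unit circle.
print(slerp([1.0, 0.0], [0.0, 1.0], 0.5))
```

Unlike a linear average, which would shrink the midpoint toward the origin, the SLERP midpoint of two unit vectors still has unit norm.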

Key Characteristics

  • Architecture: A merged model derived from two 7B parameter base models.
  • Merge Method: Utilizes the SLERP (Spherical Linear Interpolation) method, which is known for producing more coherent and effective merges compared to simpler averaging techniques.
  • Context Length: Supports a context window of 4096 tokens.
  • Precision: The model was configured to use bfloat16 for its dtype during the merge process.
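The card does not reproduce the exact merge configuration, but a SLERP merge with per-module `t` schedules and a bfloat16 dtype typically takes the following shape. This is a hypothetical, illustrative sketch (expressed as a Python dict mirroring a mergekit-style YAML config; the `layer_range` and `t` values shown are assumptions, not the model's actual settings):

```python
# Hypothetical mergekit-style SLERP config for illustration only.
# The real t schedules used for Eris-Lelantacles-7b are not published here.
merge_config = {
    "merge_method": "slerp",
    "base_model": "Nitral-AI/Eris-Beach_Day-7b",
    "slices": [{
        "sources": [
            {"model": "Nitral-AI/Eris-Beach_Day-7b", "layer_range": [0, 32]},
            {"model": "Nitral-AI/Lelanta-lake-7b", "layer_range": [0, 32]},
        ],
    }],
    "parameters": {
        # Separate interpolation curves for self-attention and MLP weights,
        # interpolated across layer depth, plus a default for everything else.
        "t": [
            {"filter": "self_attn", "value": [0.0, 0.5, 0.3, 0.7, 1.0]},
            {"filter": "mlp", "value": [1.0, 0.5, 0.7, 0.3, 0.0]},
            {"value": 0.5},
        ],
    },
    "dtype": "bfloat16",
}
```

The per-filter `t` lists are what the overview means by "varying t parameters across self-attention and MLP layers": each module family gets its own blend ratio, varying with depth, instead of one global mixing weight.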

Intended Use Cases

This model is suited to general language generation tasks where a blend of its constituent models' capabilities is desired. Developers looking for a 7B model that inherits characteristics from both "Eris-Beach_Day" and "Lelanta-lake" via a layer-aware SLERP merge may find it a useful starting point.