fearlessdots/Experimental_Orion-Nebula-10.7B-v0.1

Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4K · Architecture: Transformer

fearlessdots/Experimental_Orion-Nebula-10.7B-v0.1 is a 10.7-billion-parameter language model created by fearlessdots via a passthrough merge of the WizardLM-2-7B-abliterated model. With a 4096-token context length, this experimental merge explores the effects of combining specific layer ranges from its base model, and it is primarily suited for research and development in model-merging techniques.


Model Overview

fearlessdots/Experimental_Orion-Nebula-10.7B-v0.1 is an experimental 10.7-billion-parameter language model developed by fearlessdots. It was built with the mergekit tool using the passthrough merge method, combining specific layer ranges from the WizardLM-2-7B-abliterated model.

Merge Details

This model's defining characteristic is its merging strategy. It combines two overlapping layer ranges from the WizardLM-2-7B-abliterated model:

  • Layers 0 through 24
  • Layers 8 through 32

This configuration, which uses the passthrough merge method and float16 dtype, explores how overlapping layer contributions from a single base model shape the merged model's capabilities. Because the two ranges overlap between layers 8 and 24, those layers are duplicated in the result, which is what expands the 7B base into a 10.7-billion-parameter model. The merged model retains a context length of 4096 tokens.
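A mergekit configuration of this shape can be reconstructed from the details above. The following is a minimal sketch, not the author's published config: the layer ranges, merge method, and dtype come from this card, while the source repository path, file names, and output directory are assumptions.

```python
# Sketch: reproducing a passthrough merge of this shape with mergekit.
# Assumptions: mergekit is installed (pip install mergekit) and the base
# model lives at fearlessdots/WizardLM-2-7B-abliterated (unverified path).
import subprocess
import textwrap

config = textwrap.dedent("""\
    slices:
      - sources:
          - model: fearlessdots/WizardLM-2-7B-abliterated  # assumed repo path
            layer_range: [0, 24]
      - sources:
          - model: fearlessdots/WizardLM-2-7B-abliterated  # assumed repo path
            layer_range: [8, 32]
    merge_method: passthrough
    dtype: float16
""")

with open("orion-nebula.yml", "w", encoding="utf-8") as fp:
    fp.write(config)

# mergekit-yaml is mergekit's documented CLI entry point.
subprocess.run(["mergekit-yaml", "orion-nebula.yml", "./merged-model"], check=True)
```

In mergekit, a passthrough merge with multiple slices simply stacks the listed layer ranges in order, so the overlap between the two slices is what produces the extra depth.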

Potential Use Cases

  • Research into Model Merging: Ideal for researchers and developers studying the impact of different merging strategies and layer combinations on model performance and characteristics.
  • Experimental Development: Suitable for exploring novel architectures derived from existing powerful base models like WizardLM-2-7B-abliterated.

This model is primarily a research artifact, offering insights into advanced model merging techniques rather than being optimized for specific end-user applications.
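That said, the model can be loaded and probed like any other causal language model. The following is a minimal inference sketch using the Hugging Face transformers library, assuming the repository follows the standard AutoModelForCausalLM layout; the prompt and generation settings are illustrative only.

```python
# Sketch: loading the merged model for quick experimentation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fearlessdots/Experimental_Orion-Nebula-10.7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the merge itself was performed in float16
    device_map="auto",          # requires the accelerate package
)

prompt = "Explain what a passthrough model merge is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)  # stay within the 4096-token context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```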