ray0rf1re/Nix2.5-plus
  • Task: Text Generation
  • Model Size: 3.1B parameters
  • Quantization: BF16
  • Context Length: 32k
  • Published: Jan 31, 2026
  • License: apache-2.0
  • Architecture: Transformer (open weights)

Nix2.5-plus is a merged language model created by ray0rf1re using the slerp method from mergekit. It combines ray0rf1re/Nix2.5 and ray0rf1re/Nix1.5, with Nix2.5 contributing approximately 72.5% of the interpolation weight. The merge aims to blend the capabilities of its constituent models into a single checkpoint with characteristics distinct from either parent. It is suitable for general text generation tasks, inheriting both the strengths and the limitations of its base models.


Overview

Nix2.5-plus is a merged language model developed by ray0rf1re, created using the slerp (Spherical Linear Interpolation) method from mergekit. This model combines two existing models: ray0rf1re/Nix2.5 and ray0rf1re/Nix1.5.

Key Characteristics

  • Merge Method: Utilizes slerp for combining model weights, specifically designed to blend the characteristics of its base models.
  • Weight Distribution: The merge was performed with a t parameter of 0.275, meaning ray0rf1re/Nix2.5 contributes approximately 72.5% and ray0rf1re/Nix1.5 contributes about 27.5% to the final model (see the sketch after this list).
  • Base Model: ray0rf1re/Nix2.5 served as the primary base for this slerp merge.
  • Inherited Training: The model leverages the training data of its constituent models; users should refer to the individual model cards for details on their respective datasets.
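
For intuition about the t parameter, here is a minimal PyTorch sketch of the slerp interpolation formula applied to two weight tensors. This is an illustration only, not mergekit's actual implementation, which applies the interpolation per layer and per parameter group with additional handling; tensor names and shapes below are hypothetical.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors.

    t=0 returns v0; t=1 returns v1. With v0 as the base (Nix2.5) and
    t=0.275, the result stays roughly 72.5% of the way toward v0.
    """
    v0f = v0.flatten().float()
    v1f = v1.flatten().float()
    # Angle between the two weight vectors, via their unit directions.
    dot = torch.clamp(
        (v0f / (v0f.norm() + eps)) @ (v1f / (v1f.norm() + eps)), -1.0, 1.0
    )
    theta = torch.acos(dot)
    sin_theta = torch.sin(theta)
    if sin_theta.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        out = (1.0 - t) * v0f + t * v1f
    else:
        # Interpolate along the great-circle arc between v0 and v1.
        out = (torch.sin((1.0 - t) * theta) / sin_theta) * v0f \
            + (torch.sin(t * theta) / sin_theta) * v1f
    return out.reshape(v0.shape).to(v0.dtype)

# Example: blend two same-shaped weight matrices at t = 0.275.
w0 = torch.randn(4, 4)  # stands in for a Nix2.5 weight tensor
w1 = torch.randn(4, 4)  # stands in for a Nix1.5 weight tensor
merged = slerp(0.275, w0, w1)
```

Unlike a plain weighted average, slerp interpolates along the arc between the two weight vectors, which preserves the norm of the blended weights more faithfully when the parents point in different directions.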

Usage and Limitations

Nix2.5-plus can be loaded and used with the transformers library for various natural language processing tasks. As a merged model, its performance and potential biases are directly inherited from ray0rf1re/Nix2.5 and ray0rf1re/Nix1.5. Thorough evaluation is recommended for specific applications, as merged models can sometimes exhibit unexpected behaviors or performance degradation in certain tasks compared to their individual components.
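
A minimal loading sketch with the transformers library, assuming the repository follows standard causal-LM conventions; the prompt and generation settings are illustrative, and any chat template the model expects may differ:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ray0rf1re/Nix2.5-plus"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# BF16 matches the published quantization; device_map="auto" requires accelerate.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Briefly explain what a merged language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```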