beberik/Nyxene-v2-11B
Task: Text generation
Model size: 10.7B parameters
Quantization: FP8
Context length: 4k
Published: Dec 4, 2023
License: cc-by-nc-4.0
Architecture: Transformer
Concurrency cost: 1

beberik/Nyxene-v2-11B is a 10.7-billion-parameter language model by beberik, produced by merging several 7B models, including Starling-LM-7B-alpha, DPOpenHermes-7B, una-cybertron-7b-v2, and loyal-piano-m7-cdpo. The merge uses SLERP (spherical linear interpolation) with per-layer and per-parameter weighting to combine the strengths of its base models. It is designed for general-purpose conversational AI and instruction following, and shows competitive performance across common benchmarks.
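As a rough illustration of what a SLERP merge does, the sketch below interpolates between two weight tensors along the great-circle arc of the unit hypersphere rather than averaging them linearly. This is not beberik's actual merge recipe (the real merge operates over every tensor of the models listed above, with a per-layer interpolation schedule); the `slerp` function and the interpolation factor `t=0.4` here are illustrative assumptions.

```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a single vector, interpolating along the
    arc between the two directions; this tends to preserve weight
    geometry better than a plain linear average.
    """
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    dot = torch.clamp((a / (a.norm() + eps)) @ (b / (b.norm() + eps)), -1.0, 1.0)
    omega = torch.acos(dot)                     # angle between the two tensors
    if omega.abs() < eps:                       # nearly parallel: fall back to lerp
        merged = (1.0 - t) * a + t * b
    else:
        so = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / so) * a \
               + (torch.sin(t * omega) / so) * b
    return merged.reshape(w_a.shape).to(w_a.dtype)

# Example: merge one layer's weight matrix, weighting the second model at 0.4.
layer_a = torch.randn(4096, 4096)
layer_b = torch.randn(4096, 4096)
merged = slerp(layer_a, layer_b, t=0.4)
```

In practice, tools such as mergekit apply this kind of interpolation tensor-by-tensor with a weighting curve that varies across layers, which is how different base models can dominate different depths of the merged network.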
