beberik/Nyxene-11B
Text Generation
Concurrency Cost: 1
Model Size: 10.7B
Quant: FP8
Ctx Length: 4K
Published: Dec 2, 2023
License: cc-by-nc-4.0
Architecture: Transformer
Open Weights
beberik/Nyxene-11B is a 10.7-billion-parameter language model created by beberik, produced by merging several 7B models: Starling-LM-7B-alpha, NeuralHermes-2.5-Mistral-7B, juanako-7b-UNA, and dolphin-2.1-mistral-7b. The merge uses the slerp (spherical linear interpolation) method with specific per-parameter weightings to combine the strengths of the constituent models. It scores an average of 67.72 on the Open LLM Leaderboard, indicating strong performance across reasoning and language-understanding benchmarks.
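The exact merge recipe is not reproduced here, but as a rough illustration of the slerp method the card names, below is a minimal sketch of spherical linear interpolation applied to a single weight tensor. The function name, tensor shapes, and the 0.4 blend factor are illustrative assumptions, not the model's actual merge configuration:

```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each flattened tensor as a point on a hypersphere and
    interpolates along the great-circle arc between them, falling back
    to plain linear interpolation when the tensors are nearly parallel.
    """
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    a_unit = a / (a.norm() + eps)
    b_unit = b / (b.norm() + eps)
    dot = torch.clamp(a_unit @ b_unit, -1.0, 1.0)
    omega = torch.acos(dot)          # angle between the two weight vectors
    if omega.abs() < eps:            # nearly parallel: slerp degenerates to lerp
        merged = (1 - t) * a + t * b
    else:
        sin_omega = torch.sin(omega)
        merged = (torch.sin((1 - t) * omega) / sin_omega) * a \
               + (torch.sin(t * omega) / sin_omega) * b
    return merged.reshape(w_a.shape).to(w_a.dtype)

# Stand-in tensors for the same layer taken from two donor models;
# t=0.4 blends 60/40 in favor of model A (values chosen for illustration).
w_a = torch.randn(4096, 4096)
w_b = torch.randn(4096, 4096)
w_merged = slerp(w_a, w_b, t=0.4)
```

In practice a merge like this is run per-layer over every matched parameter of the donor models, typically with interpolation factors that vary by layer or by module type (e.g., attention vs. MLP weights), which is what "specific per-parameter weightings" refers to above.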