beberik/Nyxene-v1-11B
Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 4, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

beberik/Nyxene-v1-11B is a 10.7-billion-parameter language model created by beberik, built by merging several 7B-class models, including Starling-LM-7B-alpha and DPOpenHermes-7B. The model is aimed at creative text generation, combining the strengths of its different base models through the merge. It reports an average score of 67.58 on the Open LLM Leaderboard, covering a range of reasoning and language-understanding tasks.
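The card does not publish the exact merge recipe. As an illustration only, a common way to get a ~10.7B model out of two 7B (32-layer) bases is a passthrough-style "frankenmerge" that stacks overlapping layer ranges into a ~48-layer model. A hypothetical mergekit-style config, which is an assumption and not beberik's actual recipe, might look like:

```yaml
# Hypothetical sketch only -- NOT the published Nyxene-v1-11B recipe.
# Passthrough merge: interleave layer ranges from two 7B bases
# (32 layers each) into a single ~48-layer, ~10.7B model.
slices:
  - sources:
      - model: berkeley-nest/Starling-LM-7B-alpha
        layer_range: [0, 24]
  - sources:
      - model: openaccess-ai-collective/DPOpenHermes-7B
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```

The 24 + 24 layer split mirrors how other 10.7B models in this class (e.g. SOLAR-style depth up-scaling) reach 48 layers from 32-layer bases; the actual layer ranges and merge method used here may differ.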
