beberik/Nyxene-v3-11B
Task: Text Generation
Concurrency Cost: 1
Model Size: 10.7B
Quantization: FP8
Context Length: 4k
Published: Dec 12, 2023
License: cc-by-nc-4.0
Architecture: Transformer (open weights)

beberik/Nyxene-v3-11B is a 10.7 billion parameter language model developed by beberik. It was created by merging four base models, including Intel/neural-chat-7b-v3-3-Slerp and AIDC-ai-business/Marcoroni-7B-v3, through a multi-stage mergekit process that combines different layer ranges and applies slerp (spherical linear interpolation) with specific parameter weighting. The model targets general language tasks and achieves an average score of 70.72 across the Open LLM Leaderboard benchmarks.
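To illustrate the kind of merge described above, here is a minimal sketch of a mergekit slerp configuration. The layer ranges, interpolation weights, and dtype below are hypothetical placeholders, not the actual recipe used to build Nyxene-v3-11B; only the two model names are taken from the description.

```yaml
# Hypothetical mergekit slerp config (illustrative values only).
# Each source contributes a layer range; slerp interpolates the
# overlapping weights, with per-tensor weighting controlled by `t`.
slices:
  - sources:
      - model: Intel/neural-chat-7b-v3-3-Slerp
        layer_range: [0, 32]
      - model: AIDC-ai-business/Marcoroni-7B-v3
        layer_range: [0, 32]
merge_method: slerp
base_model: Intel/neural-chat-7b-v3-3-Slerp
parameters:
  t:
    # Attention layers blend on a schedule; everything else uses 0.5.
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - value: 0.5
dtype: bfloat16
```

A multi-stage merge like the one described would chain several such configs, stacking different layer ranges from intermediate merges to reach the final 10.7B parameter count.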
