Kukedlc/NeuralFusion-7b-Dare-Ties
Task: Text generation
Model size: 7B
Architecture: Transformer
Quantization: FP8
Context length: 4k
Concurrency cost: 1
Published: Feb 29, 2024
License: apache-2.0
Weights: Open

NeuralFusion-7b-Dare-Ties is a 7 billion parameter language model created by Kukedlc, built using a DARE TIES merge of NeuralMaxime-7B-slerp, Fasciculus-Arcuatus-7B-slerp, and NeoCortex-7B-slerp, all based on mlabonne/Monarch-7B. This model demonstrates strong general reasoning capabilities, achieving an average score of 75.94 on the Open LLM Leaderboard across various benchmarks. It is designed for general-purpose language generation and understanding tasks, with a context length of 4096 tokens.
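The DARE TIES merge combines the three fine-tuned models by sparsifying each model's delta from the shared base (DARE: randomly drop delta entries and rescale the survivors) and then resolving sign conflicts before averaging (TIES: elect a per-parameter sign and keep only agreeing deltas). The sketch below illustrates that procedure on plain NumPy arrays; it is a schematic of the published DARE and TIES algorithms, not the exact mergekit implementation used to build this model, and the drop probability `p` is an illustrative default.

```python
import numpy as np

def dare(delta, p, rng):
    # DARE: drop each delta entry with probability p,
    # rescale the survivors by 1/(1-p) to keep the expected value.
    mask = rng.random(delta.shape) >= p
    return delta * mask / (1.0 - p)

def dare_ties_merge(base, finetuned, p=0.5, seed=0):
    """Merge fine-tuned weight tensors into `base` via DARE + TIES."""
    rng = np.random.default_rng(seed)
    deltas = np.stack([dare(ft - base, p, rng) for ft in finetuned])
    # TIES sign election: pick the dominant sign per parameter.
    elected = np.sign(deltas.sum(axis=0))
    # Keep only deltas whose sign agrees with the elected sign.
    agree = np.sign(deltas) == elected
    kept = np.where(agree, deltas, 0.0)
    counts = agree.sum(axis=0)
    merged_delta = kept.sum(axis=0) / np.maximum(counts, 1)
    return base + merged_delta
```

In a real merge this runs tensor-by-tensor over every parameter of the three source models, with the base model (`mlabonne/Monarch-7B` here) supplying `base`.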
