Kukedlc/NeuralMaxime-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Kukedlc/NeuralMaxime-7B-slerp is a 7-billion-parameter language model created by Kukedlc via a slerp (spherical linear interpolation) merge of mlabonne/AlphaMonarch-7B and mlabonne/NeuralMonarch-7B. The model demonstrates strong general reasoning, achieving an average score of 76.17 on the Open LLM Leaderboard, and is suitable for a variety of natural language processing tasks, particularly those requiring robust reasoning and comprehension.
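A slerp merge interpolates between two models' weight tensors along the arc of a sphere rather than along a straight line, which tends to preserve each parent's weight geometry better than plain averaging. The sketch below illustrates the underlying slerp formula on flattened tensors using NumPy; it is a simplified illustration, not the actual mergekit implementation used to produce this model (tensor names and the per-layer interpolation schedule are omitted).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t blends along the arc.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Unit-normalized copies are used only to measure the angle between them.
    u0 = v0 / (np.linalg.norm(v0) + eps)
    u1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(u0, u1), -1.0, 1.0)
    # Nearly parallel tensors: fall back to ordinary linear interpolation,
    # since the spherical formula divides by sin(omega) ~ 0 there.
    if abs(dot) > 1.0 - eps:
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)          # angle between the two weight directions
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 \
         + (np.sin(t * omega) / sin_omega) * v1

# Example: halfway between two orthogonal unit vectors stays on the unit circle.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In a real merge this function would be applied tensor-by-tensor across the two parent checkpoints, often with a different `t` per layer group.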