Kukedlc/NeuralMaxime-7B-slerp

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Kukedlc/NeuralMaxime-7B-slerp is a 7 billion parameter language model created by Kukedlc, resulting from a slerp merge of mlabonne/AlphaMonarch-7B and mlabonne/NeuralMonarch-7B. This model demonstrates strong general reasoning capabilities, achieving an average score of 76.17 on the Open LLM Leaderboard. It is suitable for a variety of natural language processing tasks, particularly those requiring robust reasoning and comprehension.


Overview

Kukedlc/NeuralMaxime-7B-slerp is a 7 billion parameter language model developed by Kukedlc. It is a product of a slerp merge using LazyMergekit, combining the strengths of two base models: mlabonne/AlphaMonarch-7B and mlabonne/NeuralMonarch-7B.
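The exact merge recipe is not reproduced on this page, but LazyMergekit builds on mergekit, whose slerp merges are declared in YAML. A representative configuration for these two bases might look like the following; the layer ranges, per-module interpolation weights, base model choice, and dtype here are illustrative assumptions, not the actual recipe:

```yaml
# Hypothetical mergekit slerp config (assumed values, not the published recipe)
slices:
  - sources:
      - model: mlabonne/AlphaMonarch-7B
        layer_range: [0, 32]
      - model: mlabonne/NeuralMonarch-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/AlphaMonarch-7B
parameters:
  t:
    - filter: self_attn   # interpolation weights for attention tensors
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp         # interpolation weights for MLP tensors
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5          # default weight for all other tensors
dtype: bfloat16
```

In this format, `t` controls how far each tensor is rotated from the first source model toward the second, and filters let attention and MLP layers blend at different ratios.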

Key Capabilities

  • General Reasoning: Achieves a strong average score of 76.17 on the Open LLM Leaderboard, indicating robust performance across various benchmarks.
  • Benchmark Performance:
    • AI2 Reasoning Challenge (25-shot): 73.38
    • HellaSwag (10-shot): 89.18
    • MMLU (5-shot): 64.44
    • TruthfulQA (0-shot): 77.79
    • Winogrande (5-shot): 84.45
    • GSM8k (5-shot): 67.78
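The reported leaderboard average is the plain mean of the six benchmark scores above, which can be checked directly:

```python
# Open LLM Leaderboard scores reported above
scores = {
    "ARC (25-shot)": 73.38,
    "HellaSwag (10-shot)": 89.18,
    "MMLU (5-shot)": 64.44,
    "TruthfulQA (0-shot)": 77.79,
    "Winogrande (5-shot)": 84.45,
    "GSM8k (5-shot)": 67.78,
}
average = round(sum(scores.values()) / len(scores), 2)
print(average)  # 76.17
```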

Good For

  • Applications requiring solid general-purpose language understanding and generation.
  • Tasks benefiting from a model with balanced performance across diverse reasoning and knowledge-based benchmarks.
  • Developers looking for a 7B model with competitive scores on the Open LLM Leaderboard.
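At the level of individual weight tensors, the slerp merge that produced this model can be sketched as spherical linear interpolation between the two parent checkpoints. The following is a minimal stdlib-only sketch (the real merge operates per-tensor over full model checkpoints, with per-layer interpolation weights):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t rotates along the
    arc between the two directions instead of averaging linearly.
    """
    n0 = math.sqrt(sum(x * x for x in v0)) + eps
    n1 = math.sqrt(sum(x * x for x in v1)) + eps
    # Angle between the (normalized) weight vectors
    dot = max(-1.0, min(1.0, sum((a / n0) * (b / n1) for a, b in zip(v0, v1))))
    omega = math.acos(dot)
    if omega < eps:
        # Nearly parallel weights: fall back to plain linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    so = math.sin(omega)
    c0 = math.sin((1 - t) * omega) / so
    c1 = math.sin(t * omega) / so
    return [c0 * a + c1 * b for a, b in zip(v0, v1)]
```

Compared with a plain weighted average, slerp preserves the magnitude structure of the interpolated weights, which is why it is a popular choice for merging closely related fine-tunes such as the two Monarch variants.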