Kukedlc/Neural-Cosmic-Boy-7B-slerp
Kukedlc/Neural-Cosmic-Boy-7B-slerp is a 7-billion-parameter language model developed by Kukedlc on the Mistral-7B-v0.1 base architecture. It is a merge of Neural-Cosmic-7B-slerp, NeuralLogic-7B-V, and SuperCombo, produced with the ties merging method. The model demonstrates strong general reasoning, with an average score of 74.08 on the Open LLM Leaderboard, including 70.48 on the AI2 Reasoning Challenge and 64.92 on MMLU.
Model Overview
Neural-Cosmic-Boy-7B-slerp is a 7-billion-parameter language model created by Kukedlc, based on the Mistral-7B-v0.1 architecture. It merges three component models: Kukedlc/Neural-Cosmic-7B-slerp, Kukedlc/NeuralLogic-7B-V, and Kukedlc/SuperCombo. Despite the 'slerp' in its name, the merge was performed with the ties method, with a density and weight gradient configured for each component model.
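Because the result is a standard Mistral-architecture causal language model, it can be loaded with the Hugging Face transformers library. The following is a minimal sketch, assuming the checkpoint is published on the Hub under the id shown and that enough GPU memory is available for a 7B model in float16:

```python
# Minimal loading sketch (assumes the checkpoint is on the Hugging Face Hub
# under this id and that transformers + accelerate are installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Kukedlc/Neural-Cosmic-Boy-7B-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves memory vs. float32; use float32 on CPU
    device_map="auto",          # place layers on available devices automatically
)
```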
Key Capabilities & Performance
This model exhibits solid performance across various benchmarks on the Open LLM Leaderboard, indicating its general utility for a range of tasks. Key scores include:
- Average Score: 74.08
- AI2 Reasoning Challenge (25-Shot): 70.48
- HellaSwag (10-Shot): 87.65
- MMLU (5-Shot): 64.92
- TruthfulQA (0-shot): 67.10
- Winogrande (5-shot): 82.00
- GSM8k (5-shot): 72.33
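The reported average is simply the arithmetic mean of the six task scores above, which a quick check confirms:

```python
# Sanity check: the Open LLM Leaderboard average is the mean of the six
# benchmark scores reported above.
scores = {
    "ARC (25-shot)": 70.48,
    "HellaSwag (10-shot)": 87.65,
    "MMLU (5-shot)": 64.92,
    "TruthfulQA (0-shot)": 67.10,
    "Winogrande (5-shot)": 82.00,
    "GSM8k (5-shot)": 72.33,
}

average = sum(scores.values()) / len(scores)
print(f"{average:.2f}")  # 74.08
```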
Use Cases
Given its balanced performance across reasoning, commonsense, and language-understanding benchmarks, Neural-Cosmic-Boy-7B-slerp is a reasonable choice for applications where a capable 7B-class model is sufficient. Its scores suggest it can be used effectively for the following (a usage sketch follows the list):
- General question answering
- Reasoning-based tasks
- Text generation and completion
- Educational applications requiring factual recall and logical inference
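As an illustration of the question-answering use case, here is a generation sketch using the transformers pipeline API. The plain Question/Answer prompt format is an assumption; the model card does not document a specific prompt or chat template:

```python
# Illustrative generation sketch. The Question/Answer prompt style is an
# assumption; no official prompt template is documented for this model.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Kukedlc/Neural-Cosmic-Boy-7B-slerp",
    device_map="auto",
)

prompt = "Question: Why does ice float on water?\nAnswer:"
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```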