Model Overview
Kukedlc/Fasciculus-Arcuatus-7B-slerp is a 7-billion-parameter language model developed by Kukedlc. It was produced with LazyMergekit by a spherical linear interpolation (slerp) merge of macadeliccc/MonarchLake-7B and Kukedlc/NeoCortex-7B-slerp. Rather than averaging the parents' weights linearly, slerp interpolates corresponding tensors along a great-circle arc, which tends to preserve the geometry of each parent's weight space and so blend their strengths with less degradation.
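For intuition, here is a minimal numpy sketch of the slerp operation as it might be applied to one pair of corresponding weight tensors. The function name and the linear-interpolation fallback for near-parallel tensors are illustrative assumptions, not mergekit's exact implementation:

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors at mixing factor t."""
    # Measure the angle between the two tensors using normalized copies.
    u0 = v0 / (np.linalg.norm(v0) + eps)
    u1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(u0 * u1), -1.0, 1.0)
    theta = np.arccos(dot)

    # Near-parallel tensors make the spherical formula numerically unstable,
    # so fall back to ordinary linear interpolation.
    if theta < 1e-6:
        return (1.0 - t) * v0 + t * v1

    # Weight each endpoint so the result travels along the great-circle arc
    # between the two directions instead of the straight chord between them.
    sin_theta = np.sin(theta)
    w0 = np.sin((1.0 - t) * theta) / sin_theta
    w1 = np.sin(t * theta) / sin_theta
    return w0 * v0 + w1 * v1
```

In a full merge this interpolation runs tensor by tensor over both checkpoints, typically with the mixing factor t varied by layer or parameter type.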
Key Capabilities & Performance
This model exhibits strong performance across a range of general language understanding and reasoning benchmarks, as evaluated on the Open LLM Leaderboard. Key results include:
- Avg. Score: 76.07
- AI2 Reasoning Challenge (25-shot): 73.55
- HellaSwag (10-shot): 88.95
- MMLU (5-shot): 64.65
- TruthfulQA (0-shot): 72.53
- Winogrande (5-shot): 85.71
- GSM8K (5-shot): 71.04
These scores indicate balanced performance across tasks requiring commonsense reasoning, factual recall, and mathematical problem-solving.
When to Use This Model
Given its balanced benchmark performance, Fasciculus-Arcuatus-7B-slerp is well suited to general-purpose applications that call for a 7B-parameter model with solid reasoning and language-generation capabilities. Its 4096-token context length accommodates moderately long prompts and completions, making it versatile across common text-based tasks.
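A minimal usage sketch with the Hugging Face transformers pipeline follows. The prompt and generation parameters are illustrative, and the chat-template call assumes the tokenizer ships a chat template; pass a plain string prompt otherwise:

```python
import torch
from transformers import AutoTokenizer, pipeline

model_id = "Kukedlc/Fasciculus-Arcuatus-7B-slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Format a chat-style prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "What is a slerp model merge?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on a single consumer GPU
    device_map="auto",
)

output = generator(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(output[0]["generated_text"])
```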