Ppoyaa/StarMonarch-7B
StarMonarch-7B is a 7 billion parameter language model developed by Ppoyaa, created by merging mlabonne/AlphaMonarch-7B and Nexusflow/Starling-LM-7B-beta using LazyMergekit. This model features an 8k context window and demonstrates strong performance across various benchmarks, including an average score of 74.45 on the Open LLM Leaderboard. It is well-suited for general-purpose language understanding and generation tasks, leveraging the combined strengths of its constituent models.
StarMonarch-7B Overview
StarMonarch-7B was produced by merging two base models, mlabonne/AlphaMonarch-7B and Nexusflow/Starling-LM-7B-beta, with LazyMergekit, a tool that combines pretrained models without any additional training. The goal of the merge is to inherit the complementary strengths of both parents in a single 7B model.
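The card does not state the exact merge method or parameters used; for illustration only, a typical mergekit SLERP configuration for two Mistral-architecture 7B models (the kind LazyMergekit generates) looks like this, with the interpolation weights `t` being placeholder values:

```yaml
# Hypothetical example config - the actual method and weights are not published here.
slices:
  - sources:
      - model: mlabonne/AlphaMonarch-7B
        layer_range: [0, 32]
      - model: Nexusflow/Starling-LM-7B-beta
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/AlphaMonarch-7B
parameters:
  t:                      # interpolation factor per layer group (placeholders)
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5          # default for all remaining tensors
dtype: bfloat16
```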
Key Capabilities & Features
- Merged Architecture: Combines AlphaMonarch-7B and Starling-LM-7B-beta for potentially broader and more robust capabilities.
- Extended Context Window: Supports an 8k context window, allowing for processing longer inputs and generating more coherent, extended outputs.
- Strong Benchmark Performance: Achieves an average score of 74.45 on the Open LLM Leaderboard, with notable scores in:
  - AI2 Reasoning Challenge (25-shot): 71.25
  - HellaSwag (10-shot): 87.00
  - MMLU (5-shot): 65.48
  - TruthfulQA (0-shot): 67.20
  - Winogrande (5-shot): 82.16
  - GSM8k (5-shot): 73.62
Ideal Use Cases
StarMonarch-7B is suitable for a variety of general-purpose natural language processing tasks, including:
- Text Generation: Creating coherent and contextually relevant text.
- Reasoning Tasks: Handling multi-step logical and mathematical reasoning, reflected in its AI2 Reasoning Challenge and GSM8k scores.
- Question Answering: Leveraging its strong performance on TruthfulQA and MMLU.
- Conversational AI: Its 8k context window supports more extended and nuanced dialogues.
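As a minimal sketch of how the model might be used for these tasks, the snippet below loads StarMonarch-7B with the Hugging Face `transformers` library and generates one chat reply. Only the repository name `Ppoyaa/StarMonarch-7B` comes from this card; the helper names and generation settings are illustrative assumptions, and `generate_reply` downloads the full 7B weights when first called.

```python
# Illustrative usage sketch, not an official example from the model card.
MODEL_ID = "Ppoyaa/StarMonarch-7B"   # repository name from the card
MAX_CONTEXT = 8192                   # the card advertises an 8k context window


def build_messages(user_text: str) -> list[dict]:
    """Wrap a single user turn in the chat-message format that
    tokenizer.apply_chat_template expects."""
    return [{"role": "user", "content": user_text}]


def generate_reply(user_text: str, max_new_tokens: int = 256) -> str:
    """Download the weights and generate one reply (heavy: ~14 GB in fp16)."""
    # Imported lazily so build_messages stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, device_map="auto", torch_dtype="auto"
    )
    prompt = tokenizer.apply_chat_template(
        build_messages(user_text), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Because the merged model supports an 8k context, long documents or multi-turn dialogues can be passed in a single prompt as long as the tokenized input plus `max_new_tokens` stays under `MAX_CONTEXT`.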