FallenMerick/Smart-Lemon-Cookie-7B
FallenMerick/Smart-Lemon-Cookie-7B is a 7-billion-parameter language model created by FallenMerick. It was merged with the TIES method from SanjiWatsuki/Silicon-Maid-7B, SanjiWatsuki/Kunoichi-7B, and KatyTheCutie/LemonadeRP-4.5.3, using MTSAIR/multi_verse_model as the base. The model supports a context length of 8192 tokens and scores an average of 68.16 on the Open LLM Leaderboard, showing balanced performance across reasoning and language-understanding tasks. It is suitable for general-purpose applications that need a capable 7B model built from a blend of specialized models.
Smart-Lemon-Cookie-7B Overview
Smart-Lemon-Cookie-7B is a 7-billion-parameter language model developed by FallenMerick. It was created with the TIES merge method from mergekit, which combines the weights of several fine-tuned models into a single checkpoint. The base model for the merge was MTSAIR/multi_verse_model.
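For intuition, the TIES procedure (trim, elect sign, disjoint merge) can be sketched on flat weight arrays. This is a toy NumPy illustration, not mergekit's actual implementation; `density` and `lam` are illustrative hyperparameters, not the values FallenMerick used.

```python
import numpy as np

def ties_merge(base, finetuned_models, density=0.5, lam=1.0):
    """Toy TIES merge over 1-D weight arrays (illustrative only)."""
    # Task vectors: each fine-tuned model's difference from the base weights
    deltas = [m - base for m in finetuned_models]

    # 1) Trim: keep only the top-`density` fraction of each delta by magnitude
    trimmed = []
    for d in deltas:
        k = int(np.ceil(density * d.size))
        thresh = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))

    # 2) Elect signs: per parameter, the sign with the larger total mass wins
    stacked = np.stack(trimmed)
    elected = np.sign(stacked.sum(axis=0))

    # 3) Disjoint merge: average only the values that agree with the elected sign
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged = (stacked * agree).sum(axis=0) / counts

    return base + lam * merged
```

In the real merge, this sign-election step is what lets TIES reduce interference between the component models' conflicting weight updates.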
Key Merge Components
This model is a blend of:
- SanjiWatsuki/Silicon-Maid-7B
- SanjiWatsuki/Kunoichi-7B
- KatyTheCutie/LemonadeRP-4.5.3
Performance Highlights
Evaluated on the Open LLM Leaderboard, Smart-Lemon-Cookie-7B achieved an average score of 68.16. Notable scores include:
- AI2 Reasoning Challenge (25-Shot): 66.30
- HellaSwag (10-Shot): 85.53
- MMLU (5-Shot): 64.69
- TruthfulQA (0-shot): 60.66
- Winogrande (5-shot): 77.74
- GSM8k (5-shot): 54.06
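As a sanity check, the reported 68.16 average is simply the arithmetic mean of the six benchmark scores above:

```python
# Open LLM Leaderboard scores reported for Smart-Lemon-Cookie-7B
scores = {
    "ARC (25-shot)": 66.30,
    "HellaSwag (10-shot)": 85.53,
    "MMLU (5-shot)": 64.69,
    "TruthfulQA (0-shot)": 60.66,
    "Winogrande (5-shot)": 77.74,
    "GSM8k (5-shot)": 54.06,
}
average = sum(scores.values()) / len(scores)
print(round(average, 2))  # 68.16
```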
This model provides a balanced performance profile, making it a versatile option for various natural language processing tasks. GGUF quantizations are available from FaradayDotDev and mradermacher for efficient local deployment.
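A GGUF quantization can be run locally with llama-cpp-python. The sketch below assumes a Q4_K_M quant file has already been downloaded from one of the repos mentioned above; the exact filename is a placeholder, so check the FaradayDotDev or mradermacher pages for the real quant names.

```python
from llama_cpp import Llama

llm = Llama(
    model_path="Smart-Lemon-Cookie-7B.Q4_K_M.gguf",  # placeholder filename
    n_ctx=8192,        # matches the model's advertised context length
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

out = llm("Write a haiku about lemons.", max_tokens=64)
print(out["choices"][0]["text"])
```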