ChaoticNeutrals/Cookie_7B
Cookie_7B is a 7 billion parameter merged language model developed by ChaoticNeutrals, built upon jeiku/SpaghettiOs_7B using the DARE TIES merge method. The model aims to balance general assistant functionality with enhanced roleplaying and romantic interaction capabilities. It achieves an average score of 71.87 on the Open LLM Leaderboard, indicating solid logical reasoning and general language understanding.
Overview
ChaoticNeutrals/Cookie_7B is a 7 billion parameter language model created by merging pre-trained models with the mergekit tool. It uses the DARE TIES merge method, with jeiku/SpaghettiOs_7B as the base model and jeiku/Rainbow_69_7B merged in.
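The core idea behind DARE TIES can be sketched in a few lines of numpy. This is a toy illustration, not mergekit's actual implementation: DARE randomly drops a fraction of each model's delta parameters (its difference from the base) and rescales the survivors, and a TIES-style step then elects a majority sign per parameter and averages only the deltas that agree with it. The drop probability `p` and the tensor shapes here are assumed values for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dare(delta, p, rng):
    """DARE: Drop a random fraction p of the delta parameters And REscale
    the survivors by 1/(1-p), keeping the expected delta unchanged."""
    mask = rng.random(delta.shape) >= p
    return delta * mask / (1.0 - p)

def ties_merge(base, deltas, p=0.5, rng=rng):
    """Toy TIES-style merge over DARE-sparsified deltas: elect a majority
    sign per parameter, then average only the deltas agreeing with it."""
    sparse = np.stack([dare(d, p, rng) for d in deltas])
    sign = np.sign(sparse.sum(axis=0))            # elected sign per parameter
    agree = (np.sign(sparse) == sign) & (sparse != 0)
    counts = np.maximum(agree.sum(axis=0), 1)     # avoid division by zero
    merged_delta = np.where(agree, sparse, 0.0).sum(axis=0) / counts
    return base + merged_delta

# Hypothetical 1-D "weights": base model plus two fine-tuned deltas.
base = np.zeros(4)
delta_a = np.array([1.0, -1.0, 2.0, 0.5])
delta_b = np.array([0.5, 1.0, 1.0, -0.5])
merged = ties_merge(base, [delta_a, delta_b], p=0.5)
```

In the real merge these operations run tensor-by-tensor over the full model weights, with per-model density and weight settings declared in a mergekit YAML config.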
Key Capabilities
- Balanced Functionality: Designed to provide both general assistant capabilities and strong roleplaying (RP) abilities.
- Enhanced RP: Includes datasets specifically aimed at improving roleplaying and romantic interaction scenarios.
- Logical Reasoning: Described as a "reasonably logical model," suggesting competence in understanding and responding to complex prompts.
Performance Highlights
Evaluated on the Open LLM Leaderboard, Cookie_7B achieved an average score of 71.87. Notable scores include:
- HellaSwag (10-Shot): 87.57
- Winogrande (5-Shot): 81.37
- AI2 Reasoning Challenge (25-Shot): 69.71
- MMLU (5-Shot): 64.51
Good For
- Use cases requiring a balanced 7B model that can handle both general assistant tasks and more nuanced, interactive roleplaying.
- Applications involving romantic or conversational roleplay where enhanced interaction abilities are beneficial.