Overview
pandego/my-first-blend is a 7-billion-parameter language model created by pandego through a merge of existing pre-trained models. It uses the task arithmetic merge method, with mistralai/Mistral-7B-Instruct-v0.2 as the base model, and combines two models: SanjiWatsuki/Kunoichi-DPO-v2-7B (weight 0.4) and paulml/NeuralOmniWestBeaglake-7B (weight 0.6).
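Task arithmetic merges models by adding weighted "task vectors" (the element-wise difference between each fine-tuned model and the shared base) back onto the base weights. A minimal sketch of the idea on toy weight lists, with hypothetical 4-element "checkpoints" standing in for the real 7B-parameter models:

```python
# Task arithmetic on toy weight vectors (illustration only; real merges
# operate on full checkpoints, typically via a tool such as mergekit).

def task_arithmetic(base, models_and_weights):
    """Return base + sum(w_i * (model_i - base)), element-wise."""
    merged = list(base)
    for model, weight in models_and_weights:
        for i, (m, b) in enumerate(zip(model, base)):
            merged[i] += weight * (m - b)
    return merged

# Hypothetical stand-in weights (not taken from any real checkpoint).
base     = [1.0, 2.0, 3.0, 4.0]   # stand-in for Mistral-7B-Instruct-v0.2
kunoichi = [1.5, 2.5, 3.0, 4.5]   # stand-in for Kunoichi-DPO-v2-7B
beaglake = [0.5, 2.0, 4.0, 4.0]   # stand-in for NeuralOmniWestBeaglake-7B

# Same weights the card reports: 0.4 and 0.6.
merged = task_arithmetic(base, [(kunoichi, 0.4), (beaglake, 0.6)])
print(merged)
```

Parameters the two fine-tunes agree on survive largely unchanged, while their individual deviations from the base are blended in proportion to the weights.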
Key Capabilities & Performance
This blended model aims to combine the strengths of its constituent parts. Evaluation on the Open LLM Leaderboard shows balanced performance:
- Average Score: 63.66
- AI2 Reasoning Challenge (25-Shot): 69.37
- HellaSwag (10-Shot): 83.03
- MMLU (5-Shot): 53.91
- TruthfulQA (0-shot): 70.70
- Winogrande (5-shot): 79.32
- GSM8k (5-shot): 25.63
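The leaderboard average is the mean of six benchmark scores, including the GSM8k score of 25.63 reported elsewhere on this card. A quick check that the per-benchmark scores reproduce the 63.66 average:

```python
# Open LLM Leaderboard average = mean of the six benchmark scores.
scores = {
    "ARC (25-shot)": 69.37,
    "HellaSwag (10-shot)": 83.03,
    "MMLU (5-shot)": 53.91,
    "TruthfulQA (0-shot)": 70.70,
    "Winogrande (5-shot)": 79.32,
    "GSM8k (5-shot)": 25.63,
}
average = round(sum(scores.values()) / len(scores), 2)
print(average)  # matches the reported 63.66
```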
Use Cases
Given its general instruction-tuned base and merged components, pandego/my-first-blend is suitable for a variety of common NLP tasks where a 7B parameter model is appropriate. Its performance across reasoning, common sense, and truthfulness benchmarks suggests it can be applied to:
- General question answering
- Text generation and summarization
- Instruction-following applications
Users should weigh its benchmark profile, in particular the comparatively low 25.63 on GSM8k (5-shot), when evaluating its suitability for complex mathematical reasoning tasks.