mayacinka/yam-jom-7B-ties
mayacinka/yam-jom-7B-ties is a 7 billion parameter language model created by mayacinka by merging eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 and yam-peleg/Experiment26-7B with the TIES merging method. The model achieves an average score of 76.44 on the Open LLM Leaderboard, demonstrating strong performance across a range of reasoning and language understanding benchmarks. With a 4096-token context length, it is suited to general-purpose text generation and instruction-following tasks.
Model Overview
mayacinka/yam-jom-7B-ties is a 7 billion parameter language model developed by mayacinka. It was created by merging two existing models, eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 and yam-peleg/Experiment26-7B, using the TIES merging method. The merge used yam-peleg/Experiment26-7B as the base model, with merge weights of 0.35 for eren23's model and 0.65 for yam-peleg's model.
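For readers unfamiliar with how such merges are specified, the sketch below reconstructs what a mergekit-style TIES configuration for this merge might look like. The model names, merge weights (0.35 / 0.65), base model, and bfloat16 dtype come from the model card; the density and normalize values are assumptions, since they are not documented here.

```python
# Hypothetical reconstruction of a mergekit TIES configuration for this merge.
# Weights, base model, and dtype are from the model card; density and
# normalize are assumed values, not documented ones.
from pathlib import Path

ties_config = """\
models:
  - model: eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2
    parameters:
      weight: 0.35
      density: 0.5   # assumed; fraction of task-vector parameters kept after TIES trimming
  - model: yam-peleg/Experiment26-7B
    parameters:
      weight: 0.65
      density: 0.5   # assumed
merge_method: ties
base_model: yam-peleg/Experiment26-7B
parameters:
  normalize: true    # assumed default
dtype: bfloat16
"""

# Write the config to disk; the merge itself would then be run with
# mergekit's CLI, e.g.:
#   mergekit-yaml yam-jom-7B-ties.yml ./yam-jom-7B-ties --cuda
Path("yam-jom-7B-ties.yml").write_text(ties_config)
```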
Performance Highlights
This model has been evaluated on the Open LLM Leaderboard, achieving a notable average score of 76.44. Key benchmark results include:
- AI2 Reasoning Challenge (25-shot): 73.21
- HellaSwag (10-shot): 89.05
- MMLU (5-shot): 64.77
- TruthfulQA (0-shot): 77.51
- Winogrande (5-shot): 84.53
- GSM8k (5-shot): 69.60
Usage Considerations
With 7B parameters and a 4096-token context length, yam-jom-7B-ties is designed for general language understanding and generation tasks. Its balanced performance across the benchmarks above suggests it is suitable for applications requiring robust reasoning and factual recall. The model supports the bfloat16 data type for efficient inference.
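As a usage sketch, the snippet below shows one way to load the model for bfloat16 inference with the Hugging Face transformers library; the prompt and sampling parameters are illustrative choices, not recommendations from the model card.

```python
# Minimal example of loading mayacinka/yam-jom-7B-ties in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mayacinka/yam-jom-7B-ties"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype noted in the model card
    device_map="auto",
)

prompt = "Explain the TIES model-merging method in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=256,       # illustrative generation settings
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )

# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```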