damerajee/Oot-v2_lll
Oot-v2_lll: A Merged 7B Language Model
Oot-v2_lll is a 7 billion parameter language model developed by damerajee, created through a slerp (spherical linear interpolation) merge of two base models: mlabonne/Marcoro14-7B-slerp and Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp. The merge aims to combine the strengths of its components, yielding a versatile model for a range of natural language processing tasks.
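The core idea of a slerp merge is to interpolate between two models' weight tensors along the arc of a hypersphere rather than along a straight line, which tends to preserve each model's weight geometry better than plain averaging. As an illustrative sketch (not the actual merge tooling used for this model), spherical linear interpolation over flattened weight vectors looks like this:

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values move along
    the great-circle arc between the (normalized) directions of v0 and v1.
    """
    # Angle between the two vectors, computed from their unit directions
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if np.abs(theta) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return (1 - t) * v0 + t * v1
    # Standard slerp formula, applied to the raw (unnormalized) weights
    return (np.sin((1 - t) * theta) * v0 + np.sin(t * theta) * v1) / np.sin(theta)
```

In practice, merge tools apply this per weight tensor, often with a different interpolation factor per layer; the function above only shows the underlying formula.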
Performance Highlights
Evaluated on the Open LLM Leaderboard, Oot-v2_lll demonstrates solid performance across a range of benchmarks. Key scores include:
- Average: 72.73
- AI2 Reasoning Challenge (25-Shot): 69.28
- HellaSwag (10-Shot): 86.60
- MMLU (5-Shot): 64.96
- GSM8k (5-Shot): 72.18
These results indicate its capability in reasoning, common sense understanding, and general knowledge tasks. The model supports a context length of 4096 tokens.
Usage
Developers can integrate Oot-v2_lll using the Hugging Face transformers library; the model card includes Python examples for text generation. The model is configured to load in bfloat16 for efficient computation.
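A minimal loading-and-generation sketch along those lines is shown below. It assumes the model is hosted at the Hugging Face repo id `damerajee/Oot-v2_lll` and that `torch` and `transformers` are installed; the prompt and generation parameters are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def generate(prompt: str,
             model_id: str = "damerajee/Oot-v2_lll",
             max_new_tokens: int = 100) -> str:
    """Load the model in bfloat16 (as the card recommends) and generate text."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # halves memory vs. float32
        device_map="auto",           # place layers on available GPU(s)/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example call (downloads ~14 GB of weights on first use):
# print(generate("Explain model merging in one sentence."))
```

Because the model supports a 4096-token context, prompts plus generated tokens should stay within that window; longer inputs will be truncated by the tokenizer or rejected by the model.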