posicube/Llama2-chat-AYB-13B
posicube/Llama2-chat-AYB-13B is a 13 billion parameter Llama-2-chat-based language model developed by Posicube Inc. It was created by ensembling top-performing models across various benchmarks, with the aim of maximizing overall performance. The model reached the top rank among 13B models on the Open LLM Leaderboard by combining the strengths of different specialized models, making it suitable for general-purpose chat applications that need strong, balanced capabilities across tasks.
Model Overview
posicube/Llama2-chat-AYB-13B is a 13 billion parameter language model developed by Posicube Inc., built upon the Llama-2-13b-chat backbone. This model was created with the hypothesis that ensembling top-ranking models from various benchmarks could maximize overall performance. It leverages the HuggingFace Transformers library and was fine-tuned using Orca-style and Alpaca-style datasets.
Key Capabilities & Performance
This model demonstrated strong performance, achieving the top rank among 13B models on the Open LLM Leaderboard as of October 3rd, 2023. Its evaluation scores are competitive across multiple benchmarks:
- ARC (25-shot): 63.48
- HellaSwag (10-shot): 84.87
- MMLU (5-shot): 59.59
- TruthfulQA (0-shot): 55.22
- Average Score: 65.78
Intended Use Cases
Given its strong general-purpose benchmark performance, this model is well-suited to a variety of chat-based applications where robust, balanced capabilities across different tasks are desired. As with all LLMs, it can produce inaccurate, biased, or otherwise objectionable responses, so developers should perform their own safety testing and tuning before deployment. The model is subject to the original Llama-2 license and usage restrictions.
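Because the model is built on the Llama-2-13b-chat backbone, it presumably expects the standard Llama-2 chat prompt convention (`[INST] ... [/INST]` with an optional `<<SYS>>` system block). Below is a minimal sketch of a prompt-formatting helper; the function name is illustrative, and this single-turn template is an assumption based on the Llama-2-chat format rather than documentation specific to this model.

```python
def build_llama2_chat_prompt(system_prompt: str, user_message: str) -> str:
    """Format a single-turn prompt in the Llama-2-chat convention.

    The system prompt is wrapped in <<SYS>> markers inside the first
    [INST] block, followed by the user message. (Illustrative helper;
    not part of the model's official API.)
    """
    return (
        f"[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )


prompt = build_llama2_chat_prompt(
    "You are a helpful assistant.",
    "Summarize the plot of Hamlet in one sentence.",
)
```

The resulting string can then be tokenized and passed to the model via the HuggingFace Transformers library for generation.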