s3nh/Noromaid-Aeryth-7B

Text Generation

  • Concurrency Cost: 1
  • Model Size: 7B
  • Quant: FP8
  • Ctx Length: 4k
  • Published: Jan 8, 2024
  • License: apache-2.0
  • Architecture: Transformer

Noromaid-Aeryth-7B is a 7 billion parameter language model created by s3nh via a SLERP (spherical linear interpolation) merge of NeverSleep/Noromaid-7b-v0.2 and NeuralNovel/Aeryth-7B-v0.1. It is aimed at general language tasks and shows balanced performance across benchmarks such as MMLU and HellaSwag. With a 4096 token context length, it offers a versatile foundation for applications requiring robust language understanding and generation.


Model Overview

s3nh/Noromaid-Aeryth-7B is a 7 billion parameter language model developed by s3nh. It was created with the SLERP merge method by combining two base models: NeverSleep/Noromaid-7b-v0.2 and NeuralNovel/Aeryth-7B-v0.1. Rather than averaging weights linearly, SLERP interpolates between the two models' parameters along an arc, which tends to preserve the geometry of each model's weight space while blending the characteristics of both.
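To make the merge method concrete, the following is a minimal sketch of spherical linear interpolation applied to a pair of flattened weight tensors. This is an illustrative implementation in NumPy, not the exact merge script used for this model; the interpolation factor `t` and the fallback-to-linear threshold are assumptions.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two vectors' directions rather than a straight line.
    """
    # Normalize copies to find the angle between the two vectors.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)

    # Nearly parallel vectors: fall back to plain linear interpolation.
    if theta < eps:
        return (1.0 - t) * v0 + t * v1

    sin_theta = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1
```

In a real merge, this function would be applied per-tensor (or per-layer, with layer-dependent `t`) across the two checkpoints' state dicts.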

Performance Highlights

Evaluated on the Open LLM Leaderboard, Noromaid-Aeryth-7B achieves an average score of 57.82. Key benchmark results include:

  • HellaSwag (10-Shot): 78.62
  • MMLU (5-Shot): 57.29
  • TruthfulQA (0-shot): 65.66
  • Winogrande (5-shot): 71.82
  • AI2 Reasoning Challenge (25-Shot): 56.74

While the model shows solid general language understanding and commonsense reasoning, its low GSM8k score (16.76) indicates it is not well suited to multi-step mathematical reasoning tasks.
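As a quick sanity check, the reported leaderboard average can be reproduced from the six individual benchmark scores, assuming it is their unweighted mean (the GSM8k shot count is not stated above and is omitted here):

```python
# Benchmark scores as listed in this model card.
scores = {
    "ARC (25-shot)": 56.74,
    "HellaSwag (10-shot)": 78.62,
    "MMLU (5-shot)": 57.29,
    "TruthfulQA (0-shot)": 65.66,
    "Winogrande (5-shot)": 71.82,
    "GSM8k": 16.76,
}

# Unweighted mean across all six benchmarks.
average = sum(scores.values()) / len(scores)
print(f"{average:.2f}")  # matches the reported 57.82 (57.815 before rounding)
```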

Use Cases

This model is suitable for a range of general-purpose language generation and understanding tasks where a balanced performance across various domains is desired. Its 7B parameter size and 4096 token context window make it a good candidate for applications requiring efficient processing and moderate context handling.