s3nh/Noromaid-Aeryth-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 8, 2024 · License: apache-2.0 · Architecture: Transformer
Noromaid-Aeryth-7B is a 7-billion-parameter language model created by s3nh via a SLERP (spherical linear interpolation) merge of NeverSleep/Noromaid-7b-v0.2 and NeuralNovel/Aeryth-7B-v0.1. The model targets general language tasks and shows balanced performance across benchmarks such as MMLU and HellaSwag. With a 4,096-token context length, it offers a versatile foundation for applications that require robust language understanding and generation.
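The model card says the weights come from a SLERP merge. As a minimal sketch of what that interpolation does to a single pair of weight tensors (this is an illustration of the math, not the actual merge tooling used; the function name and signature are hypothetical):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    # Spherical linear interpolation between two weight tensors.
    # t=0 returns v0, t=1 returns v1; in between, the result moves
    # along the arc between the two (normalized) directions.
    v0f, v1f = v0.ravel(), v1.ravel()
    n0, n1 = np.linalg.norm(v0f), np.linalg.norm(v1f)
    dot = np.clip(np.dot(v0f / n0, v1f / n1), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two tensors
    if omega < eps:
        # Nearly parallel: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

In an actual model merge, an interpolation like this is applied per weight tensor (often with a different `t` per layer); unlike a plain average, SLERP preserves the angular geometry between the two parent models' weights.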