Weyaxi/Seraph-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Dec 11, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights
Seraph-7B is a 7-billion-parameter language model developed by Weyaxi. It is built on the Mistral-7B-v0.1 base model as a slerp merge of Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp and Q-bert/MetaMath-Cybertron-Starling. The model is instruction-tuned and achieves an average score of 71.86 on the Open LLM Leaderboard, with strong results across benchmarks covering reasoning and commonsense tasks. It is suitable for general-purpose conversational AI and other tasks requiring robust language understanding.
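The slerp merge mentioned above blends the weights of two parent models by interpolating along the arc between them rather than along a straight line, which better preserves the magnitude of the merged parameters. Below is a minimal numpy sketch of the idea on toy tensors; it is illustrative only (the actual merge was presumably produced with a dedicated merging toolkit, and the function name and signature here are assumptions, not the tool's API).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between the two tensors (flattened to vectors).
    """
    v0_flat = v0.ravel()
    v1_flat = v1.ravel()
    # Angle between the two tensors, treated as flat vectors.
    cos_omega = np.dot(v0_flat, v1_flat) / (
        np.linalg.norm(v0_flat) * np.linalg.norm(v1_flat)
    )
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)
    if np.sin(omega) < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Toy example: blend two 2x2 "weight matrices" halfway.
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, a, b)
```

In a real merge this interpolation is applied layer by layer across the two parent checkpoints, often with a different `t` per layer group.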