Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp
Text generation · Model size: 7B · Quant: FP8 · Context length: 4K · Concurrency cost: 1 · Published: Dec 9, 2023 · License: apache-2.0 · Architecture: Transformer

Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp is a 7-billion-parameter language model created by Weyaxi by merging teknium/OpenHermes-2.5-Mistral-7B and Intel/neural-chat-7b-v3-3 with the slerp (spherical linear interpolation) method. The model supports a 4096-token context window and is designed for general conversational AI tasks. It performs strongly across benchmarks, with an average score of 71.38 on the Open LLM Leaderboard, making it suitable for diverse applications requiring robust language understanding and generation.
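The slerp merge mentioned above interpolates the two parent models' weights along a great circle rather than averaging them linearly, which preserves the magnitude-independent "direction" of each weight vector. Below is a minimal, illustrative sketch of the idea on plain Python lists; it is not the actual merge tooling's implementation, and the `slerp` helper and its parameters are assumptions for illustration only:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t is the interpolation factor in [0, 1]: t=0 returns v0, t=1 returns v1.
    """
    # Normalize both vectors to compute the angle between them.
    n0 = math.sqrt(sum(x * x for x in v0)) + eps
    n1 = math.sqrt(sum(x * x for x in v1)) + eps
    dot = sum((a / n0) * (b / n1) for a, b in zip(v0, v1))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding outside [-1, 1]

    if abs(dot) > 0.9995:
        # Vectors are nearly parallel: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]

    theta = math.acos(dot)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Example: interpolating halfway between two orthogonal unit vectors
# lands on the arc between them rather than at the linear midpoint.
merged = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In a real merge this interpolation is applied tensor by tensor across both checkpoints, often with a different `t` per layer group.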
