Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-1-7B
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Nov 24, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-1-7B is a 7-billion-parameter language model created by Weyaxi by merging teknium/OpenHermes-2.5-Mistral-7B and Intel/neural-chat-7b-v3-1 with the TIES merge method. The merge combines the strengths of its two base models, offering a 4096-token context length and an average score of 67.84 on the Open LLM Leaderboard, making it suitable for general conversational AI and instruction-following tasks.
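For intuition, the TIES procedure the model card refers to (trim each task vector, elect a per-parameter sign, then average the agreeing values) can be sketched on toy NumPy arrays. This is an illustrative simplification under assumed defaults (`density`, `lam` are hypothetical parameter names), not the actual tooling or settings Weyaxi used:

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5, lam=1.0):
    """Toy TIES merge of several fine-tuned weight arrays into one."""
    # Task vectors: difference between each fine-tuned model and the base.
    tvs = [ft - base for ft in finetuned]
    # Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for tv in tvs:
        k = int(np.ceil(density * tv.size))
        thresh = np.sort(np.abs(tv).ravel())[-k]
        trimmed.append(np.where(np.abs(tv) >= thresh, tv, 0.0))
    trimmed = np.stack(trimmed)
    # Elect: per-parameter sign with the largest total magnitude across models.
    elected = np.sign(trimmed.sum(axis=0))
    # Merge: mean of the trimmed values whose sign agrees with the elected sign.
    agree = (np.sign(trimmed) == elected) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged = np.where(agree, trimmed, 0.0).sum(axis=0) / counts
    return base + lam * merged

# Example: two "fine-tuned" variants of a 4-parameter base model.
base = np.zeros(4)
ft_a = np.array([1.0, -1.0, 0.1, 0.0])
ft_b = np.array([2.0, 1.0, 0.0, 0.2])
result = ties_merge(base, [ft_a, ft_b], density=0.5)
# Only the parameter where both trimmed vectors agree in sign survives.
```

In the example, the second parameter is dropped because the two models disagree in sign after trimming, which is the interference-reduction step that distinguishes TIES from plain weight averaging.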
