Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Dec 10, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights
Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp is a 7-billion-parameter language model created by Weyaxi by merging MetaMath-Mistral-7B and OpenHermes-2.5-neural-chat-v3-3-Slerp with the slerp (spherical linear interpolation) method. The merge combines the mathematical reasoning capabilities of MetaMath with the conversational and instruction-following strengths of OpenHermes and neural-chat. It is designed for tasks requiring both robust mathematical problem-solving and general-purpose conversational ability, and operates with a 4096-token context length.
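As a rough illustration of what a slerp merge does, the sketch below interpolates two weight tensors along the arc between them rather than along a straight line, which tends to preserve each parent model's weight geometry better than plain averaging. This is a minimal standalone sketch, not the actual merge pipeline used for this model; real merges are typically done with a tool such as mergekit, applying per-tensor (often per-layer) interpolation factors across every parameter of both checkpoints.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two tensors' directions.
    """
    # Compute the angle between the tensors from their normalized forms
    a = v0 / np.linalg.norm(v0)
    b = v1 / np.linalg.norm(v1)
    dot = np.clip(np.sum(a * b), -1.0, 1.0)
    omega = np.arccos(dot)
    sin_omega = np.sin(omega)
    # Nearly parallel tensors: fall back to ordinary linear interpolation
    if abs(sin_omega) < eps:
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + \
           (np.sin(t * omega) / sin_omega) * v1

# Toy example: interpolate two small "weight tensors" halfway
w_math = np.array([1.0, 0.0, 0.5])   # stand-in for a MetaMath tensor
w_chat = np.array([0.2, 0.9, 0.1])   # stand-in for an OpenHermes/neural-chat tensor
merged = slerp(0.5, w_math, w_chat)
```

In a full merge, this interpolation would be applied to every matching parameter tensor of the two parent checkpoints, and the interpolation factor `t` may vary by layer.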