Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Ctx length: 4k · Published: Dec 8, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights
Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp is a 7-billion-parameter language model created by Weyaxi by merging MetaMath-Mistral-7B and Intel/neural-chat-7b-v3-2 with the SLERP (spherical linear interpolation) merge method. The merge combines the mathematical reasoning capabilities of MetaMath with the conversational strengths of neural-chat, making the model suitable for tasks that require both logical processing and natural language interaction. By interpolating between the two parent checkpoints rather than averaging them directly, the merge aims to preserve the strengths of each constituent model while keeping performance balanced.
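To illustrate the idea behind a SLERP merge, here is a minimal sketch of spherical linear interpolation applied to a pair of weight tensors. This is an illustrative NumPy implementation, not the exact code used to produce this model (merges like this are typically done with tooling such as mergekit, and real merges apply the interpolation per layer with per-layer mixing ratios):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the mixing ratio: 0.0 returns v0, 1.0 returns v1.
    Falls back to plain linear interpolation when the tensors
    are nearly parallel (the spherical formula is unstable there).
    """
    v0_flat = v0.ravel().astype(np.float64)
    v1_flat = v1.ravel().astype(np.float64)
    # Angle between the two tensors, treated as flat vectors
    dot = np.dot(
        v0_flat / (np.linalg.norm(v0_flat) + eps),
        v1_flat / (np.linalg.norm(v1_flat) + eps),
    )
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if theta < eps:
        # Nearly parallel: linear interpolation is equivalent and stable
        return (1.0 - t) * v0 + t * v1
    # Standard SLERP weighting
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)
```

Unlike a plain weighted average, SLERP interpolates along the arc between the two parameter vectors, which better preserves their magnitudes; this is one reason it is a popular choice for merging fine-tuned checkpoints that share a common base model.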