arcee-ai/Hermes-2-Pro-WizardMath-7B-SLERP
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Oct 14, 2024 | Architecture: Transformer | Cold
Hermes-2-Pro-WizardMath-7B-SLERP is a 7-billion-parameter language model created by arcee-ai as a SLERP merge of NousResearch/Hermes-2-Pro-Mistral-7B and WizardLM/WizardMath-7B-V1.1. The merge is designed to combine strong general instruction following with enhanced mathematical reasoning, making the model suitable for tasks that require both logical problem-solving and broad conversational understanding. It has a 4096-token context length and uses a V-shaped SLERP curve to balance the strengths of its constituent models across layers.
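As a rough illustration of the merge technique described above, the sketch below shows how SLERP (spherical linear interpolation) blends two models' weight tensors, with a V-shaped schedule that varies the interpolation factor by layer depth. The `slerp` and `v_curve` helpers and the `t_min`/`t_max` endpoints are illustrative assumptions, not arcee-ai's actual merge configuration (which would normally be expressed as a mergekit config).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    The angle is computed on normalized copies; the original tensors are
    interpolated so their scale is preserved.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = float(np.clip(np.sum(v0_n * v1_n), -1.0, 1.0))
    # Nearly colinear tensors: fall back to plain linear interpolation.
    if abs(dot) > 0.9995:
        return (1.0 - t) * v0 + t * v1
    theta = np.arccos(dot)
    sin_theta = np.sin(theta)
    s0 = np.sin((1.0 - t) * theta) / sin_theta
    s1 = np.sin(t * theta) / sin_theta
    return s0 * v0 + s1 * v1

def v_curve(layer_idx, num_layers, t_min=0.3, t_max=0.7):
    """V-shaped schedule: t is smallest at the middle layers and largest at
    the first and last layers. The endpoint values here are hypothetical."""
    mid = (num_layers - 1) / 2
    frac = abs(layer_idx - mid) / mid
    return t_min + (t_max - t_min) * frac

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_hermes = rng.normal(size=4096)  # stand-in for a Hermes-2-Pro tensor
    w_wizard = rng.normal(size=4096)  # stand-in for a WizardMath tensor
    for layer in (0, 16, 31):
        t = v_curve(layer, num_layers=32)
        merged = slerp(t, w_hermes, w_wizard)
        print(f"layer {layer:2d}: t={t:.2f}, merged norm={np.linalg.norm(merged):.2f}")
```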