Weyaxi/MetaMath-Chupacabra-7B-v2.01-Slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 8, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Weyaxi/MetaMath-Chupacabra-7B-v2.01-Slerp is a 7-billion-parameter language model created by Weyaxi, merged from MetaMath-Mistral-7B and Chupacabra-7B-v2.01 using the slerp (spherical linear interpolation) method. The merge combines the mathematical reasoning capabilities of MetaMath with the general language understanding of Chupacabra, offering balanced performance on tasks that require both. It is built on the Mistral-7B-v0.1 base model and supports a context length of 4096 tokens.
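Slerp merging interpolates between two models' weight tensors along the great-circle arc connecting them, rather than averaging them linearly, which tends to preserve each parent's weight geometry better. Below is a minimal sketch of the slerp formula applied to a pair of flattened weight vectors; it is an illustration of the technique only, not the exact tooling or per-layer interpolation schedule Weyaxi used for this merge:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between weight vectors v0 and v1.

    t=0 returns v0, t=1 returns v1; intermediate t blends along the arc.
    """
    # Angle between the two vectors, computed on normalized copies
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)

    # Nearly parallel vectors: fall back to plain linear interpolation
    if theta < eps:
        return (1 - t) * v0 + t * v1

    # Slerp weights from the sine rule
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# Hypothetical example: blend two tiny "weight" vectors halfway
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # lies on the unit arc between a and b
```

In a real merge this interpolation is applied tensor-by-tensor across both checkpoints, often with different `t` values for attention and MLP layers.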
