Weyaxi/MetaMath-Tulpar-7b-v2-Slerp
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Dec 8, 2023 · License: apache-2.0 · Architecture: Transformer

Weyaxi/MetaMath-Tulpar-7b-v2-Slerp is a 7-billion-parameter language model created by Weyaxi on the Mistral-7B-v0.1 architecture. It is a merge of MetaMath-Mistral-7B and HyperbeeAI/Tulpar-7b-v2 using spherical linear interpolation (Slerp), aiming to combine the robust mathematical reasoning of the former with the general language understanding of the latter.
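The idea behind a Slerp merge can be illustrated with a minimal sketch: each pair of corresponding weight tensors from the two parent models is interpolated along the arc between them rather than along a straight line, which preserves the norm geometry of the weights better than plain averaging. This is a simplified illustration, not the exact implementation used to produce this model; the function name and interpolation factor `t` are illustrative.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two (normalized) directions. Illustrative sketch only.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the two weight vectors (via normalized directions)
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    if abs(dot) > 1.0 - eps:
        # Nearly colinear vectors: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    theta = np.arccos(dot)
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# Toy example on 2-D "weights": halfway between orthogonal directions
merged = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In a real merge this interpolation would be applied tensor-by-tensor across both checkpoints, often with different `t` values for different layer groups.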
