Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 5, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties is a 7-billion-parameter language model created by Weyaxi, produced by merging MetaMath-Mistral-7B and NeuralHermes-2.5-Mistral-7B with the TIES-merging technique. The merge combines the mathematical reasoning capabilities of MetaMath with the general conversational and instruction-following strengths of NeuralHermes, making the model suited to tasks that require both robust mathematical problem-solving and broad language understanding within its 4096-token context window.
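TIES merges of this kind are typically produced with the mergekit library, where the source models, merge method, and base model are declared in a YAML config. The sketch below is illustrative only: the repository IDs, density/weight values, and base model are assumptions, not the author's published recipe.

```yaml
# Hypothetical mergekit config for a TIES merge of the two source models.
# Density and weight values here are illustrative, not Weyaxi's actual settings.
models:
  - model: meta-math/MetaMath-Mistral-7B
    parameters:
      density: 0.5   # fraction of delta weights kept before sign resolution
      weight: 0.5    # contribution of this model to the merged deltas
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1  # common base both fine-tunes share
parameters:
  normalize: true
dtype: float16
```

With a config like this, TIES trims each fine-tune's parameter deltas to the densest fraction, resolves sign conflicts by majority, and averages the surviving deltas onto the shared base model.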