mvpmaster/NeuralDareDMistralPro-7b-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
mvpmaster/NeuralDareDMistralPro-7b-slerp is a 7-billion-parameter language model created by mvpmaster via a SLERP (spherical linear interpolation) merge of mlabonne/NeuralDaredevil-7B and NousResearch/Hermes-2-Pro-Mistral-7B. The merge combines the strengths of its base components and offers a 4096-token context window. It is designed for general-purpose conversational AI and instruction-following tasks, drawing on the diverse training data of its constituent models.
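The "slerp" in the model name refers to spherical linear interpolation, the formula used to blend corresponding weight tensors of the two parent models. A minimal sketch of that interpolation on plain Python lists (the actual merge, typically done with a tool such as mergekit, applies this per tensor across the full checkpoints; the function and parameter names here are illustrative):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at fraction t."""
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    cos_omega = max(-1.0, min(1.0, dot / (n0 * n1)))
    # Nearly parallel vectors: fall back to plain linear interpolation
    if 1.0 - abs(cos_omega) < eps:
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    omega = math.acos(cos_omega)
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors lies on the unit sphere
print(slerp(0.5, [1.0, 0.0], [0.0, 1.0]))
```

Unlike straight weight averaging, SLERP interpolates along the arc between the two weight vectors, which preserves their norms more faithfully when blending model parameters.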