mlabonne/NeuralPipe-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 27, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

mlabonne/NeuralPipe-7B-slerp is a 7-billion-parameter language model created by mlabonne by merging OpenPipe/mistral-ft-optimized-1218 and mlabonne/NeuralHermes-2.5-Mistral-7B with the slerp (spherical linear interpolation) method. The model shows strong general language understanding and reasoning, scoring an average of 71.17 on the Open LLM Leaderboard, and its 4096-token context window makes it suitable for a wide range of natural language processing tasks.
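Slerp merging interpolates between two models' weight tensors along the arc of a hypersphere rather than along a straight line, which better preserves the geometry of each parent's weights than plain averaging. A minimal NumPy sketch of the interpolation formula itself (an illustration of the math, not mergekit's actual implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Falls back to linear interpolation when the vectors are
    nearly colinear (sin(omega) ~ 0).
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the normalized vectors
    dot = np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1))
    omega = np.arccos(np.clip(dot, -1.0, 1.0))
    if np.sin(omega) < eps:
        return (1 - t) * v0 + t * v1  # lerp fallback for colinear vectors
    return (np.sin((1 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

# t=0 recovers the first parent, t=1 the second,
# intermediate t values trace the arc between them.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # midpoint on the arc: [0.7071..., 0.7071...]
```

In practice, mergekit applies this per weight tensor, with a separate interpolation factor `t` for different layer groups (e.g. attention vs. MLP blocks).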
