AurelPx/NeuralPipe-7B-slerp
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 21, 2024 · License: apache-2.0 · Architecture: Transformer

AurelPx/NeuralPipe-7B-slerp is a 7-billion-parameter language model created by AurelPx, produced by a slerp (spherical linear interpolation) merge of OpenPipe/mistral-ft-optimized-1218 and mlabonne/NeuralHermes-2.5-Mistral-7B. By blending the weights of its two parents, the model aims to combine their strengths, offering balanced performance for general-purpose text generation and instruction-following tasks. It targets applications that need a capable 7B model with a 4096-token context window.
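A slerp merge interpolates each pair of parent weight tensors along the great-circle arc between them rather than along a straight line, which better preserves the geometry of the weights when the two checkpoints point in different directions. The sketch below illustrates the underlying math only, assuming plain Python lists as stand-ins for weight tensors and an illustrative interpolation factor; the actual merge would have been produced by a dedicated merge toolchain with per-layer settings, which this does not reproduce.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two equally sized vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between them on the hypersphere."""
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the two vectors, clamped against
    # floating-point rounding error before acos.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: slerp degenerates, so fall back
        # to ordinary linear interpolation.
        return [(1.0 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1.0 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors stays on the unit circle:
print(slerp(0.5, [1.0, 0.0], [0.0, 1.0]))  # both components ≈ 0.707
```

Note that, unlike straight linear averaging, the slerp midpoint of two unit vectors still has norm 1, which is the property merge authors cite for preferring it.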
