DeepKarkhanis/NeuralPipe-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Jan 9, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

DeepKarkhanis/NeuralPipe-7B-slerp is a 7-billion-parameter language model created by DeepKarkhanis, formed by merging OpenPipe/mistral-ft-optimized-1218 and mlabonne/NeuralHermes-2.5-Mistral-7B using spherical linear interpolation (SLERP). The model inherits the strengths of both base components and offers a 4096-token context length. By interpolating weights along the hypersphere rather than averaging them linearly, the SLERP merge aims to combine the capabilities of a Mistral fine-tune with a NeuralHermes variant, making the result suitable for general-purpose conversational AI and instruction-following tasks.
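To illustrate the idea behind a SLERP merge, here is a minimal sketch of spherical linear interpolation applied to two weight vectors. This is not the actual merge recipe used for this model (which operates layer-by-layer on full tensors, typically via a tool such as mergekit); it only demonstrates the interpolation formula itself, with the function name `slerp` and the linear-interpolation fallback being illustrative choices:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    great-circle arc between the two directions instead of the chord
    a plain weighted average would take.
    """
    # Angle between the two vectors, computed from their directions.
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In a real model merge, this interpolation is applied per parameter tensor, often with a different `t` schedule for attention and MLP layers, so that each layer blends the two parent models to a different degree.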
