Samee-ur/NeuralPipe-7B-slerp
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quant: FP8
Context length: 4k
Published: Feb 1, 2024
License: apache-2.0
Architecture: Transformer
Open Weights · Cold

Samee-ur/NeuralPipe-7B-slerp is a 7 billion parameter language model created by Samee-ur. It was produced by merging OpenPipe/mistral-ft-optimized-1218 and mlabonne/NeuralHermes-2.5-Mistral-7B using spherical linear interpolation (SLERP), a merge method that blends the weights of two models along the arc between them rather than along a straight line. Built on the Mistral architecture, the merged model combines the strengths of its constituent models and is suited to general-purpose text generation tasks.
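To illustrate the idea behind a SLERP merge, the sketch below interpolates between two weight tensors along the arc between them. This is a minimal illustration of the interpolation formula, not the actual merge pipeline used to build this model (tools such as mergekit apply it per-layer with per-tensor interpolation factors); the function name and parameters are illustrative.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight tensors v0 and v1 at fraction t."""
    # Normalize copies to measure the angle between the two weight vectors
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(v0_n * v1_n), -1.0, 1.0)
    # Near-parallel vectors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)          # angle between the vectors
    sin_omega = np.sin(omega)
    # Weighted combination along the great-circle arc
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + \
           (np.sin(t * omega) / sin_omega) * v1

# At t=0 or t=1 the result is one of the endpoints; t=0.5 is the midpoint of the arc.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

Unlike a straight linear average, SLERP preserves the norm of unit vectors along the interpolation path, which is why it is often preferred for blending normalized weight directions.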
