mlabonne/NeuralPipe-7B-ties
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Dec 27, 2023
License: apache-2.0
Architecture: Transformer

NeuralPipe-7B-ties is a 7-billion-parameter language model developed by mlabonne, created by merging OpenPipe/mistral-ft-optimized-1218 and mlabonne/NeuralHermes-2.5-Mistral-7B with the TIES merging method. The model targets general reasoning and language understanding tasks, achieving an average score of 71.55 on the Open LLM Leaderboard. It supports a 4096-token context window, making it suitable for a variety of conversational and analytical applications.
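TIES merges of this kind are typically produced with the mergekit library, which takes a YAML configuration naming the source models, the merge method, and a shared base model. The sketch below is illustrative only: the `density` and `weight` values are assumptions, not the actual parameters used for NeuralPipe-7B-ties.

```yaml
# Hypothetical mergekit config sketch for a TIES merge of the two
# source models; density/weight values are illustrative assumptions.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model contributes no delta of its own
  - model: OpenPipe/mistral-ft-optimized-1218
    parameters:
      density: 0.5   # fraction of delta weights kept before sign election
      weight: 0.5    # relative contribution in the merged model
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true    # rescale summed weights back to unit total
dtype: float16
```

In TIES, each fine-tuned model's delta from the base is sparsified (`density`), conflicting parameter signs are resolved by majority, and the surviving deltas are averaged, which reduces interference between the two source models compared with a plain weight average.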
