liminerity/Blur-7B-slerp-v0.1
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 14, 2024 · License: apache-2.0 · Architecture: Transformer

liminerity/Blur-7B-slerp-v0.1 is a 7-billion-parameter language model created by liminerity by merging OpenPipe/mistral-ft-optimized-1218 and mlabonne/Marcoro14-7B-slerp using spherical linear interpolation (slerp). The merge combines the strengths of both parent models and retains their 4096-token context length. It achieves an average score of 72.40 on the Open LLM Leaderboard, reflecting solid performance across a range of reasoning and language-understanding tasks.
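Slerp merging interpolates each pair of weight tensors along the great-circle arc between them rather than averaging linearly, which better preserves tensor magnitudes. A minimal NumPy sketch of per-tensor slerp (the function name, epsilon, and linear-interpolation fallback are illustrative; the actual merge was most likely produced with a dedicated merge tool rather than hand-rolled code like this):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between tensors v0 and v1 at fraction t in [0, 1]."""
    # Normalize flattened copies to measure the angle between the two tensors
    a = v0.ravel() / (np.linalg.norm(v0) + eps)
    b = v1.ravel() / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(a, b), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    # Weights follow the standard slerp formula sin((1-t)θ)/sinθ and sin(tθ)/sinθ
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

At t=0 this returns the first model's tensor, at t=1 the second's, and intermediate values trace the arc between them; a merge applies this independently to every matching parameter tensor of the two parent checkpoints.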
