liminerity/Blur-7b-slerp-v1.41
Text Generation · Open Weights · Cold
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k
Published: Feb 25, 2024 · License: apache-2.0 · Architecture: Transformer
Blur-7b-slerp-v1.41 is a 7 billion parameter language model created by liminerity by merging bardsai/jaskier-7b-dpo-v5.6 and liminerity/merge with the slerp (spherical linear interpolation) method. The model achieves an average score of 75.98 on the Open LLM Leaderboard, indicating strong performance across reasoning and language-understanding benchmarks, and is well suited to general-purpose applications that require robust language generation and comprehension.
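The card states the model was produced by slerp-merging two 7B parents. As a rough illustration of what spherical linear interpolation does to a pair of weight vectors, here is a minimal NumPy sketch; the function and the toy vectors are illustrative only and are not the actual merge tooling used to build this model:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    arc on the hypersphere rather than the straight chord, which is
    the idea behind slerp-based model merging.
    """
    v0_unit = v0 / (np.linalg.norm(v0) + eps)
    v1_unit = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_unit, v1_unit), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1.0 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / sin_theta) * v0 + \
           (np.sin(t * theta) / sin_theta) * v1

# Toy example: interpolating halfway between two orthogonal unit vectors
# stays on the unit sphere, unlike a plain average (which would shrink).
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # components equal, norm stays 1.0
```

In a real merge this interpolation is applied tensor-by-tensor across the two parent checkpoints, typically with a per-layer or per-tensor `t` schedule.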