liminerity/Blur-7b-slerp-v1.46
**Text Generation** · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Concurrency Cost: 1 · Published: Feb 26, 2024 · License: apache-2.0 · Architecture: Transformer

liminerity/Blur-7b-slerp-v1.46 is a 7-billion-parameter language model created by liminerity by merging liminerity/merge and bardsai/jaskier-7b-dpo-v5.6 with the SLERP (spherical linear interpolation) merge method, sketched below. The model has a 4096-token context length and performs well across common benchmarks covering reasoning, common sense, and language understanding, making it suitable for a wide range of general-purpose natural language processing tasks.
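
The core of a SLERP merge is to interpolate each pair of corresponding weight tensors along the arc between them rather than along a straight line, which preserves the overall magnitude of the weights better than plain averaging. The following is a minimal illustrative sketch of that operation, not the exact implementation used to build this model; the function name `slerp` and the interpolation factor `t` are assumptions for illustration.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor,
          eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors of the same shape."""
    a = v0.flatten().float()
    b = v1.flatten().float()
    # Angle between the two weight vectors on the unit sphere
    dot = torch.clamp(
        (a / (a.norm() + eps)) @ (b / (b.norm() + eps)), -1.0, 1.0
    )
    theta = torch.acos(dot)
    # Nearly parallel vectors: fall back to ordinary linear interpolation
    if theta.abs() < eps:
        return torch.lerp(v0, v1, t)
    sin_theta = torch.sin(theta)
    w0 = torch.sin((1.0 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return (w0 * a + w1 * b).reshape(v0.shape).to(v0.dtype)
```

A full merge would apply this tensor-by-tensor across the two parent checkpoints, typically with `t` varied per layer group as merge tools such as mergekit allow.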
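
For basic inference, the model can be loaded with the standard Hugging Face `transformers` API. This is a minimal generation example, assuming the checkpoint is published under this identifier on the Hugging Face Hub and that `torch` and `accelerate` are installed for `device_map="auto"`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "liminerity/Blur-7b-slerp-v1.46"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place layers on available devices
)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```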