liminerity/Blur-7b-v1.21
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
Blur-7b-v1.21 is a 7-billion-parameter language model developed by liminerity, created by merging three existing models: udkai/Turdus, decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP, and liminerity/Blur-7b-v1.2. The merged model achieves strong average performance across benchmarks covering reasoning, common-sense, and language-understanding tasks. With a 4096-token context length, it is suited to general-purpose conversational AI and text-generation applications.
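Merges like this are commonly produced with mergekit. The model card does not state the exact merge method or parameters, so the config below is a hypothetical sketch: the merge_method (dare_ties), the base_model choice, and the density/weight values are all assumptions, not the recipe liminerity actually used.

```yaml
# Hypothetical mergekit config illustrating a three-way merge;
# method and parameter values are assumed, not taken from the model card.
models:
  - model: udkai/Turdus
    parameters:
      density: 0.5   # assumed: fraction of delta weights kept
      weight: 0.3    # assumed: contribution to the merged model
  - model: decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties          # assumed merge method
base_model: liminerity/Blur-7b-v1.2  # assumed base (the third listed model)
dtype: bfloat16
```

With mergekit installed, a config like this is run via `mergekit-yaml config.yml ./output-dir`; the resulting weights load like any other Hugging Face Transformers checkpoint.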