liminerity/Blur-7b-v1.2
Task: Text Generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Jan 18, 2024
License: apache-2.0
Architecture: Transformer
liminerity/Blur-7b-v1.2 is a 7-billion-parameter language model created by liminerity. It was formed by merging liminerity/Blured-Ties-7B and freecs/ThetaWave-7B with the TIES merging method, using mlabonne/NeuralBeagle14-7B as the base model. The model achieves an average score of 67.74 on the Open LLM Leaderboard, indicating strong general reasoning and language understanding, and is suitable for a variety of general-purpose natural language processing tasks.
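To illustrate the idea behind TIES merging, the following is a minimal sketch of its three steps (trim, elect sign, disjoint merge) applied to toy task vectors, i.e. per-parameter deltas from a shared base model. This is a simplified illustration on plain Python lists, not the actual mergekit implementation used to build this model; the function name and `density` parameter are illustrative assumptions.

```python
def ties_merge(task_vectors, density=0.5):
    """Toy TIES merge: trim, elect sign, then average agreeing values.

    task_vectors: list of equal-length lists of floats (deltas from a base).
    density: fraction of highest-magnitude entries kept per vector.
    """
    # Step 1: trim — keep only the top `density` fraction of entries
    # by magnitude in each task vector, zeroing the rest.
    trimmed = []
    for tv in task_vectors:
        k = max(1, int(len(tv) * density))
        threshold = sorted((abs(x) for x in tv), reverse=True)[k - 1]
        trimmed.append([x if abs(x) >= threshold else 0.0 for x in tv])

    merged = []
    for i in range(len(trimmed[0])):
        vals = [tv[i] for tv in trimmed]
        # Step 2: elect a sign per parameter by total magnitude.
        pos = sum(v for v in vals if v > 0)
        neg = -sum(v for v in vals if v < 0)
        sign = 1.0 if pos >= neg else -1.0
        # Step 3: disjoint merge — average only the values that
        # agree with the elected sign.
        agree = [v for v in vals if v * sign > 0]
        merged.append(sum(agree) / len(agree) if agree else 0.0)
    return merged


# Two toy "task vectors"; conflicting signs at index 1 are resolved
# by sign election rather than cancelling out in a plain average.
tv_a = [1.0, -2.0, 0.1, 3.0]
tv_b = [0.5, 2.0, -0.2, 1.0]
print(ties_merge([tv_a, tv_b], density=0.5))  # → [0.0, 2.0, 0.0, 2.0]
```

In a real merge these vectors are full weight tensors and the merged deltas are added back onto the base model's weights; sign election is what lets TIES avoid the interference that naive weight averaging suffers when donor models disagree.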