liminerity/Blurstral-7b-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
liminerity/Blurstral-7b-slerp is a 7-billion-parameter language model created by liminerity by merging Mistral-7B-v0.1 and Blur-7b-slerp-v0.1 with the SLERP (spherical linear interpolation) merge method. The model has a 4096-token context length and achieves an average score of 69.08 on the Open LLM Leaderboard, indicating balanced performance across reasoning and language-understanding tasks. It is suitable for general-purpose applications that need a capable 7B model.
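For context, a SLERP merge interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the geometry of both parent models. The sketch below is a minimal, hypothetical illustration of the slerp formula in Python using NumPy and placeholder tensors; it is not the exact recipe or tooling used to produce this model.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    Falls back to plain linear interpolation when the tensors are
    nearly parallel (the slerp denominator would vanish).
    """
    v0_flat, v1_flat = v0.ravel(), v1.ravel()
    # Cosine of the angle between the two weight vectors.
    dot = np.dot(v0_flat, v1_flat) / (
        np.linalg.norm(v0_flat) * np.linalg.norm(v1_flat) + eps
    )
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)            # angle between the two tensors
    if np.sin(omega) < eps:           # nearly colinear -> lerp
        return (1.0 - t) * v0 + t * v1
    # Standard slerp: blend along the arc between v0 and v1.
    return (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

# Hypothetical example: blend one layer's weights at t = 0.5.
# Random stand-ins for a Mistral-7B-v0.1 and a Blur-7b-slerp-v0.1 weight matrix.
w_mistral = np.random.randn(1024, 1024).astype(np.float32)
w_blur = np.random.randn(1024, 1024).astype(np.float32)
w_merged = slerp(0.5, w_mistral, w_blur)
print(w_merged.shape)  # (1024, 1024)
```

In practice merges like this are usually run with a dedicated merging tool over every layer of the checkpoints, often with a different interpolation factor per layer or per module, rather than a single global t as in this sketch.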