liminerity/Blurstral-7b-slerp

Text Generation
  • Model Size: 7B
  • Quantization: FP8
  • Context Length: 4k
  • Concurrency Cost: 1
  • Published: Jan 17, 2024
  • License: apache-2.0
  • Architecture: Transformer

liminerity/Blurstral-7b-slerp is a 7-billion-parameter language model created by liminerity by merging Mistral-7B-v0.1 and Blur-7b-slerp-v0.1 with the slerp method. The model supports a 4096-token context window and achieves an average score of 69.08 on the Open LLM Leaderboard, indicating balanced performance across reasoning and language understanding tasks. It is suitable for general-purpose applications that require a capable 7B model.


Model Overview

liminerity/Blurstral-7b-slerp is a 7-billion-parameter language model developed by liminerity. It is the product of merging two base models, mistralai/Mistral-7B-v0.1 and liminerity/Blur-7b-slerp-v0.1, using the slerp (spherical linear interpolation) merge method. This approach combines the strengths of its constituent models into a single set of weights, offering versatile language understanding and generation capabilities.
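For intuition, the sketch below shows the general slerp formula applied to a pair of weight tensors. It is an illustration of the technique, not the exact configuration used to produce this model; merge tools such as mergekit typically apply per-layer interpolation schedules and additional edge-case handling on top of this core operation.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors at factor t in [0, 1].

    Minimal sketch: real merge tools vary t per layer and handle more edge cases.
    """
    # Treat each tensor as a single high-dimensional vector.
    a, b = v0.flatten().float(), v1.flatten().float()

    # Angle between the two weight vectors.
    cos_theta = torch.dot(a, b) / (a.norm() * b.norm() + eps)
    theta = torch.arccos(cos_theta.clamp(-1.0, 1.0))
    sin_theta = torch.sin(theta)

    if sin_theta.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        # slerp(v0, v1, t) = sin((1-t)θ)/sin(θ) · v0 + sin(tθ)/sin(θ) · v1
        merged = (torch.sin((1.0 - t) * theta) / sin_theta) * a \
               + (torch.sin(t * theta) / sin_theta) * b

    return merged.reshape(v0.shape).to(v0.dtype)
```

Unlike simple linear averaging, slerp interpolates along the arc between the two weight vectors, preserving their magnitude characteristics, which is why it is a popular choice for model merging.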

Key Capabilities & Performance

Blurstral-7b-slerp demonstrates competitive performance across a range of benchmarks, as evaluated on the Open LLM Leaderboard. Its key scores include:

  • Average Score: 69.08
  • AI2 Reasoning Challenge (25-shot): 66.30
  • HellaSwag (10-shot): 85.38
  • MMLU (5-shot): 65.18
  • TruthfulQA (0-shot): 53.40
  • Winogrande (5-shot): 81.37
  • GSM8k (5-shot): 62.85

These results, with the reported average being the arithmetic mean of the six benchmark scores, indicate balanced proficiency in commonsense reasoning, language understanding, multi-task accuracy, and mathematical problem-solving.

Usage Considerations

This model is designed for general text generation and understanding tasks. Its 7B parameter size and 4096-token context window make it suitable for applications where a moderately sized, efficient, and capable language model is required. Developers can integrate it with the Hugging Face transformers library, as in the usage sketch below.
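A minimal loading and generation sketch with transformers follows. The prompt and generation parameters are illustrative defaults, not settings recommended by the model author, and device_map="auto" requires the accelerate package.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "liminerity/Blurstral-7b-slerp"

# Load the tokenizer and model; device_map="auto" places weights on GPU if available.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the checkpoint's native precision
    device_map="auto",
)

prompt = "Explain spherical linear interpolation in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate up to 256 new tokens with light sampling.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base-model merge rather than an instruction-tuned chat model, plain-text completion prompts like the one above are a safer starting point than chat templates.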