liminerity/Blured-Ties-7B

Task: Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Blured-Ties-7B is a 7 billion parameter language model developed by liminerity, created by merging liminerity/Blurstral-7b-slerp and diffnamehard/Mistral-CatMacaroni-slerp-uncensored-7B using the Ties-Merging method. Based on the Mistral-7B-v0.1 architecture, this model is designed for general text generation tasks with a 4096-token context length. Its unique merge configuration aims to combine the strengths of its constituent models.


Model Overview

Blured-Ties-7B is a 7 billion parameter language model developed by liminerity. It is a product of merging two distinct models: liminerity/Blurstral-7b-slerp and diffnamehard/Mistral-CatMacaroni-slerp-uncensored-7B. This merge was performed using the LazyMergekit tool, specifically employing the Ties-Merging method.

Key Characteristics

  • Architecture: Built upon the mistralai/Mistral-7B-v0.1 base model.
  • Merge Method: Utilizes the "ties" merge method, with specific density and weight parameters applied to the constituent models.
  • Configuration: The merge process involved normalizing parameters and setting the dtype to bfloat16.
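The card does not publish the exact merge file, but a Ties merge of these two models via LazyMergekit would typically look like the following sketch. The `density` and `weight` values shown are illustrative placeholders, not the values liminerity actually used; the base model, merge method, `normalize`, and `bfloat16` settings come from the card itself.

```yaml
# Hypothetical mergekit configuration for Blured-Ties-7B (values illustrative)
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model: contributes no task vectors of its own in a ties merge
  - model: liminerity/Blurstral-7b-slerp
    parameters:
      density: 0.5   # placeholder: fraction of parameters retained
      weight: 0.5    # placeholder: relative contribution to the merge
  - model: diffnamehard/Mistral-CatMacaroni-slerp-uncensored-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true    # stated on the card
dtype: bfloat16      # stated on the card
```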

Intended Use Cases

This model is suitable for a variety of general text generation tasks, leveraging the combined capabilities of its merged components. Developers can integrate it into their applications with the Hugging Face transformers library, loading the model and generating text within its 4096-token context window.
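A minimal loading sketch using the transformers library. The model ID and context length come from this card; the generation settings (`max_new_tokens`, `device_map`) are illustrative defaults, not values prescribed by the author.

```python
MODEL_ID = "liminerity/Blured-Ties-7B"
MAX_CONTEXT = 4096  # context length stated on this card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load Blured-Ties-7B and return a completion for `prompt`."""
    # Imports are kept local so the sketch can be read without
    # transformers/torch installed; move them to module level in real code.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the merge dtype on this card
        device_map="auto",           # place layers on available GPU(s)/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


# Example usage (downloads ~14 GB of weights on first run):
# print(generate("Explain model merging in one paragraph."))
```

Keep prompt plus generated tokens under the 4096-token window, or truncate the prompt before calling `generate`.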