jeiku/Zephyr_beta_32k_7B

  • Task: Text Generation
  • Model Size: 7B
  • Quantization: FP8
  • Context Length: 4k
  • Published: Mar 25, 2024
  • License: apache-2.0
  • Architecture: Transformer

jeiku/Zephyr_beta_32k_7B is a 7 billion parameter language model created by jeiku, based on a merge of mistralai/Mistral-7B-Instruct-v0.2 and typeof/zephyr-7b-beta-lora. This model leverages the DARE TIES merge method to combine the strengths of its base components. It is designed for general instruction-following tasks, building upon the Mistral architecture.


Overview

This model, jeiku/Zephyr_beta_32k_7B, is a 7 billion parameter language model developed by jeiku. It is a merged model, combining the capabilities of existing pre-trained language models using the mergekit tool.

Merge Details

The model was created using the DARE TIES merge method, which randomly drops a fraction of each contributing model's parameter deltas, rescales the surviving deltas to preserve their expected contribution, and resolves sign conflicts between models before merging. The base model for this merge was mistralai/Mistral-7B-Instruct-v0.2. This base was then merged with typeof/zephyr-7b-beta-lora.
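As a rough intuition for the DARE half of the method: each fine-tune is represented by its parameter deltas from the base model, a random fraction of those deltas is dropped, and the survivors are rescaled by the inverse keep rate so the expected sum is preserved. The sketch below uses made-up numbers and omits the TIES sign-election step for brevity; it is an illustration of the idea, not the mergekit implementation:

```python
import random

def dare_merge(base, delta, density=0.5, seed=0):
    """Toy DARE step for one weight vector: drop each delta entry
    with probability (1 - density), rescale survivors by 1/density,
    and add the result onto the base weights."""
    rng = random.Random(seed)
    merged = []
    for b, d in zip(base, delta):
        if rng.random() < density:       # delta survives the drop
            merged.append(b + d / density)
        else:                            # delta dropped entirely
            merged.append(b)
    return merged

# Made-up 4-weight example
print(dare_merge([1.0, 2.0, 3.0, 4.0], [0.2, -0.4, 0.1, 0.3]))
```

Because survivors are scaled by 1/density, the merged weights equal the base plus the full delta in expectation, which is why DARE can drop most of a fine-tune's deltas without losing its behavior.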

Key Components

  • Base Model: mistralai/Mistral-7B-Instruct-v0.2
  • Merged Component: typeof/zephyr-7b-beta-lora
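mergekit declares merges like this in a YAML config. The model card does not publish the exact configuration used, so the density and weight values below are placeholders; only the method name and the two components are taken from the card:

```yaml
# Hypothetical mergekit config -- exact settings were not published.
merge_method: dare_ties
base_model: mistralai/Mistral-7B-Instruct-v0.2
models:
  - model: mistralai/Mistral-7B-Instruct-v0.2
    # the base model contributes no delta of its own
  - model: mistralai/Mistral-7B-Instruct-v0.2+typeof/zephyr-7b-beta-lora
    # "+" applies the LoRA on top of the base before merging
    parameters:
      density: 0.5   # fraction of deltas kept (assumed)
      weight: 1.0    # merge weight (assumed)
dtype: bfloat16
```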

Intended Use

This model is suitable for applications requiring a 7B parameter model with instruction-following capabilities, benefiting from the combined characteristics of its constituent models. Its architecture is rooted in the Mistral family, known for strong performance in its size class.