dphn/dolphin-2.2-mistral-7b

Text Generation

  • Model Size: 7B
  • Quant: FP8
  • Context Length: 4k
  • Published: Oct 29, 2023
  • License: apache-2.0
  • Architecture: Transformer
  • Open Weights

dphn/dolphin-2.2-mistral-7b is a 7-billion-parameter language model based on the Mistral architecture. This release was identified as overfit and has been superseded by dolphin-2.2.1-mistral-7b, which is recommended in its place for general use.


Model Overview

dphn/dolphin-2.2-mistral-7b is a 7-billion-parameter language model built on the Mistral architecture. This version was found to be overfit during development and has since been superseded by an improved release.

Key Information

  • Architecture: Mistral-7B
  • Parameters: 7 Billion
  • Context Length: 4096 tokens
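
Dolphin-series models conventionally expect the ChatML prompt template; that template is an assumption here, as this page does not state it. A minimal sketch of assembling such a prompt (the function name `build_chatml_prompt` is illustrative, not part of any library):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt (assumed template for Dolphin-series models)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example: the resulting string is passed to the tokenizer; keep the total
# token count under the model's 4096-token context window.
prompt = build_chatml_prompt("You are Dolphin, a helpful assistant.", "Hello!")
print(prompt)
```

In practice the full prompt plus the generated continuation must fit within the 4096-token context length listed above.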

Important Note

Users are strongly advised not to use this model version. It has been re-released as dolphin-2.2.1-mistral-7b, which addresses the overfitting issues present in this release and is recommended for all applications.