EleutherAI/Mistral-7B-v0.1-nli-first-ft

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 15, 2024 · Architecture: Transformer · Cold

EleutherAI/Mistral-7B-v0.1-nli-first-ft is a 7 billion parameter language model developed by EleutherAI. This model is a fine-tuned version of Mistral-7B-v0.1, specifically adapted for Natural Language Inference (NLI) tasks. With a 4096-token context length, it is designed to excel at understanding and classifying logical relationships between text passages.


Model Overview

EleutherAI/Mistral-7B-v0.1-nli-first-ft builds on the Mistral-7B-v0.1 base model, retaining its 7 billion parameters and 4096-token context window while adding fine-tuning targeted at Natural Language Inference (NLI): classifying the logical relationship between a premise and a hypothesis.

Key Characteristics

  • Base Model: Fine-tuned from Mistral-7B-v0.1.
  • Parameter Count: 7 billion parameters.
  • Context Length: Supports a 4096-token context window.
  • Primary Focus: Specialized for Natural Language Inference (NLI).
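Because premise, hypothesis, and any prompt template must all fit within the 4096-token window, it can help to check input length before generation. A minimal sketch (the `fits_in_context` helper and whitespace tokenization are illustrative stand-ins, not part of the model card; in practice you would pass the model's real tokenizer):

```python
MAX_CTX = 4096  # context window stated on the model card


def fits_in_context(text: str, max_new_tokens: int = 8,
                    tokenize=str.split) -> bool:
    """Check that the prompt plus a generation budget fits the 4k window.

    `tokenize` defaults to whitespace splitting as a rough stand-in;
    real usage would pass the model's tokenizer, e.g.
    lambda t: tokenizer(t)["input_ids"].
    """
    return len(tokenize(text)) + max_new_tokens <= MAX_CTX


print(fits_in_context("Premise: ... Hypothesis: ..."))  # → True
```

Whitespace counts undercount subword tokens, so a real check with the model's tokenizer will be stricter.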

Use Cases

Given its specialization, this model is primarily intended for applications requiring robust NLI capabilities. While the model card does not detail specific training data or evaluation metrics, the "nli-first-ft" designation suggests its strength lies in tasks such as:

  • Determining entailment, contradiction, or neutrality between text pairs.
  • Logical reasoning over textual information.
  • Fact-checking and consistency verification in natural language.
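The card does not document the prompting format the fine-tune expects, so the template and label set below are assumptions; a minimal sketch of prompt-based entailment classification with Hugging Face `transformers`:

```python
# Hypothetical prompt template: the exact wording and label set are
# assumptions, not documented in the model card.
def build_nli_prompt(premise: str, hypothesis: str) -> str:
    """Format a premise/hypothesis pair for prompt-based NLI."""
    return (
        f"Premise: {premise}\n"
        f"Hypothesis: {hypothesis}\n"
        "Label (entailment, contradiction, or neutral):"
    )


# Loading the checkpoint itself requires a multi-GB download, so the
# model call is shown but not executed here:
# from transformers import AutoModelForCausalLM, AutoTokenizer
# name = "EleutherAI/Mistral-7B-v0.1-nli-first-ft"
# tokenizer = AutoTokenizer.from_pretrained(name)
# model = AutoModelForCausalLM.from_pretrained(name)
# inputs = tokenizer(
#     build_nli_prompt("A man is sleeping.", "A person is awake."),
#     return_tensors="pt",
# )
# output = model.generate(**inputs, max_new_tokens=5)
# print(tokenizer.decode(output[0], skip_special_tokens=True))

print(build_nli_prompt("A man is sleeping.", "A person is awake."))
```

If the fine-tune instead exposes a sequence-classification head, `AutoModelForSequenceClassification` with paired inputs would be the more natural entry point; the model card does not say which applies.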